datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
JovialValley/syllable_totalMapped4 | ---
dataset_info:
features:
- name: input_values
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 108772580
num_examples: 390
- name: test
num_bytes: 27386468
num_examples: 97
download_size: 137043673
dataset_size: 136159048
---
# Dataset Card for "syllable_totalMapped4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Atipico1/nq-test-adv-replaced | ---
dataset_info:
features:
- name: question
dtype: string
- name: entity
dtype: string
- name: similar_entity
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: masked_query
dtype: string
- name: original_case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: unans_case
list:
- name: answer
dtype: string
- name: answers
sequence: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: conflict_case
list:
- name: answer
dtype: string
- name: conflict_context
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: context
dtype: string
- name: context_vague
dtype: string
- name: entities
dtype: string
- name: entities_count
dtype: int64
- name: adv_sent
dtype: string
- name: adv_passage
dtype: string
- name: hasanswer
dtype: bool
- name: is_adversarial
dtype: bool
splits:
- name: test
num_bytes: 58003429
num_examples: 3610
download_size: 33899916
dataset_size: 58003429
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
Yanbin99/GITQA-Base-Pruned | ---
license: mit
---
|
joey234/mmlu-moral_scenarios-rule-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 376684
num_examples: 895
download_size: 90881
dataset_size: 376684
---
# Dataset Card for "mmlu-moral_scenarios-rule-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/6bf7f89d | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1329
dataset_size: 184
---
# Dataset Card for "6bf7f89d"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BAAI/COIG-PC | ---
language:
- zh
license: unknown
extra_gated_heading: Acknowledge license to accept the repository
extra_gated_prompt: "北京智源人工智能研究院(以下简称“我们”或“研究院”)通过BAAI DataHub(data.baai.ac.cn)和COIG-PC\
\ HuggingFace仓库(https://huggingface.co/datasets/BAAI/COIG-PC)向您提供开源数据集(以下或称“数据集”),您可通过下载的方式获取您所需的开源数据集,并在遵守各原始数据集使用规则前提下,基于学习、研究、商业等目的使用相关数据集。\n\
在您获取(包括但不限于访问、下载、复制、传播、使用等处理数据集的行为)开源数据集前,您应认真阅读并理解本《COIG-PC开源数据集使用须知与免责声明》(以下简称“本声明”)。一旦您获取开源数据集,无论您的获取方式为何,您的获取行为均将被视为对本声明全部内容的认可。\n\
1.\t平台的所有权与运营权\n您应充分了解并知悉,BAAI DataHub和COIG-PC HuggingFace仓库(包括当前版本及全部历史版本)的所有权与运营权归智源人工智能研究院所有,智源人工智能研究院对本平台/本工具及开源数据集开放计划拥有最终解释权和决定权。\n\
您知悉并理解,基于相关法律法规更新和完善以及我们需履行法律合规义务的客观变化,我们保留对本平台/本工具进行不定时更新、维护,或者中止乃至永久终止提供本平台/本工具服务的权利。我们将在合理时间内将可能发生前述情形通过公告或邮件等合理方式告知您,您应当及时做好相应的调整和安排,但我们不因发生前述任何情形对您造成的任何损失承担任何责任。\n\
2.\t开源数据集的权利主张\n为了便于您基于学习、研究、商业的目的开展数据集获取、使用等活动,我们对第三方原始数据集进行了必要的格式整合、数据清洗、标注、分类、注释等相关处理环节,形成可供本平台/本工具用户使用的开源数据集。\n\
您知悉并理解,我们不对开源数据集主张知识产权中的相关财产性权利,因此我们亦无相应义务对开源数据集可能存在的知识产权进行主动识别和保护,但这不意味着我们放弃开源数据集主张署名权、发表权、修改权和保护作品完整权(如有)等人身性权利。而原始数据集可能存在的知识产权及相应合法权益由原权利人享有。\n\
此外,向您开放和使用经合理编排、加工和处理后的开源数据集,并不意味着我们对原始数据集知识产权、信息内容等真实、准确或无争议的认可,您应当自行筛选、仔细甄别,使用经您选择的开源数据集。您知悉并同意,研究院对您自行选择使用的原始数据集不负有任何无缺陷或无瑕疵的承诺义务或担保责任。\n\
3.\t开源数据集的使用限制\n您使用数据集不得侵害我们或任何第三方的合法权益(包括但不限于著作权、专利权、商标权等知识产权与其他权益)。\n获取开源数据集后,您应确保对开源数据集的使用不超过原始数据集的权利人以公示或协议等形式明确规定的使用规则,包括原始数据的使用范围、目的和合法用途等。我们在此善意地提请您留意,如您对开源数据集的使用超出原始数据集的原定使用范围及用途,您可能面临侵犯原始数据集权利人的合法权益例如知识产权的风险,并可能承担相应的法律责任。\n\
4.\t个人信息保护\n基于技术限制及开源数据集的公益性质等客观原因,我们无法保证开源数据集中不包含任何个人信息,我们不对开源数据集中可能涉及的个人信息承担任何法律责任。\n\
如开源数据集涉及个人信息,我们不对您使用开源数据集可能涉及的任何个人信息处理行为承担法律责任。我们在此善意地提请您留意,您应依据《个人信息保护法》等相关法律法规的规定处理个人信息。\n\
为了维护信息主体的合法权益、履行可能适用的法律、行政法规的规定,如您在使用开源数据集的过程中发现涉及或者可能涉及个人信息的内容,应立即停止对数据集中涉及个人信息部分的使用,并及时通过“6.\
\ 投诉与通知”中载明的联系我们。\n5.\t信息内容管理\n我们不对开源数据集可能涉及的违法与不良信息承担任何法律责任。\n如您在使用开源数据集的过程中发现开源数据集涉及或者可能涉及任何违法与不良信息,您应立即停止对数据集中涉及违法与不良信息部分的使用,并及时通过“6.\
\ 投诉与通知”中载明的联系我们。\n6.\t投诉与通知\n如您认为开源数据集侵犯了您的合法权益,您可通过010-50955974联系我们,我们会及时依法处理您的主张与投诉。\n\
为了处理您的主张和投诉,我们可能需要您提供联系方式、侵权证明材料以及身份证明等材料。请注意,如果您恶意投诉或陈述失实,您将承担由此造成的全部法律责任(包括但不限于合理的费用赔偿等)。\n\
7.\t责任声明\n您理解并同意,基于开源数据集的性质,数据集中可能包含来自不同来源和贡献者的数据,其真实性、准确性、客观性等可能会有所差异,我们无法对任何数据集的可用性、可靠性等做出任何承诺。\n\
在任何情况下,我们不对开源数据集可能存在的个人信息侵权、违法与不良信息传播、知识产权侵权等任何风险承担任何法律责任。\n在任何情况下,我们不对您因开源数据集遭受的或与之相关的任何损失(包括但不限于直接损失、间接损失以及可得利益损失等)承担任何法律责任。\n\
8.\t其他\n开源数据集处于不断发展、变化的阶段,我们可能因业务发展、第三方合作、法律法规变动等原因更新、调整所提供的开源数据集范围,或中止、暂停、终止开源数据集提供业务。\n"
extra_gated_fields:
Name: text
Affiliation: text
Country: text
I agree to use this model for non-commercial use ONLY: checkbox
extra_gated_button_content: Acknowledge license
configs:
- config_name: default
data_files:
- split: full
path: data/full-*
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
- split: Top50PerTask
path: data/Top50PerTask-*
- split: Top100PerTask
path: data/Top100PerTask-*
- split: Top200PerTask
path: data/Top200PerTask-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: split
dtype: string
- name: task_name_in_eng
dtype: string
- name: task_type
struct:
- name: major
sequence: string
- name: minor
sequence: string
- name: domain
sequence: string
- name: other
dtype: string
- name: filename
dtype: string
splits:
- name: full
num_bytes: 198933665241
num_examples: 321332879
- name: train
num_bytes: 135575192364
num_examples: 208529583
- name: valid
num_bytes: 1703151331
num_examples: 2087767
- name: test
num_bytes: 5763748490
num_examples: 8094740
- name: Top50PerTask
num_bytes: 113823936
num_examples: 63643
- name: Top100PerTask
num_bytes: 222242916
num_examples: 127158
- name: Top200PerTask
num_bytes: 435753269
num_examples: 253558
download_size: 275132519
dataset_size: 342747577547
---
# COIG Prompt Collection
## License
**Default Licensing for Sub-Datasets Without Specific License Declaration**: In instances where sub-datasets within the COIG-PC Dataset do not have a specific license declaration, the Apache License 2.0 (Apache-2.0) will be the applicable licensing terms by default.
**Precedence of Declared Licensing for Sub-Datasets**: For any sub-dataset within the COIG-PC Dataset that has an explicitly declared license, the terms and conditions of the declared license shall take precedence and govern the usage of that particular sub-dataset.
Users and developers utilizing the COIG-PC Dataset must ensure compliance with the licensing terms as outlined above. It is imperative to review and adhere to the specified licensing conditions of each sub-dataset, as they may vary.
## What is COIG-PC?
The COIG-PC Dataset is a meticulously curated and comprehensive collection of Chinese tasks and data, designed to facilitate the fine-tuning and optimization of language models for Chinese natural language processing (NLP). The dataset aims to provide researchers and developers with a rich set of resources to improve the capabilities of language models in handling Chinese text, which can be utilized in various fields such as text generation, information extraction, sentiment analysis, machine translation, among others.
If you think COIG-PC is too huge, please refer to [COIG-PC-Lite](https://huggingface.co/datasets/BAAI/COIG-PC-Lite) which is a subset of COIG-PC with only 200 samples from each task file.
## Why COIG-PC?
The COIG-PC Dataset is an invaluable resource for the domain of natural language processing (NLP) for various compelling reasons:
**Addressing Language Complexity**: Chinese is known for its intricacy, with a vast array of characters and diverse grammatical structures. A specialized dataset like COIG-PC, which is tailored for the Chinese language, is essential to adequately address these complexities during model training.
**Comprehensive Data Aggregation**: The COIG-PC Dataset is a result of an extensive effort in integrating almost all available Chinese datasets in the market. This comprehensive aggregation makes it one of the most exhaustive collections for Chinese NLP.
**Data Deduplication and Normalization**: The COIG-PC Dataset underwent rigorous manual processing to eliminate duplicate data and perform normalization. This ensures that the dataset is free from redundancy, and the data is consistent and well-structured, making it more user-friendly and efficient for model training.
**Fine-tuning and Optimization**: The dataset’s instruction-based phrasing facilitates better fine-tuning and optimization of language models. This structure allows models to better understand and execute tasks, which is particularly beneficial in improving performance on unseen or novel tasks.
The COIG-PC Dataset, with its comprehensive aggregation, meticulous selection, deduplication, and normalization of data, stands as an unmatched resource for training and optimizing language models tailored for the Chinese language and culture. It addresses the unique challenges of Chinese language processing and serves as a catalyst for advancements in Chinese NLP.
## Who builds COIG-PC?
COIG-PC is anchored in a base dataset furnished by stardust.ai, which aggregates data collected from the Internet.
Beyond that, COIG-PC is the result of a collaborative effort involving engineers and experts from over twenty distinguished universities at home and abroad. Due to space constraints, it is not feasible to list all of them; however, the following are a few notable institutions among the collaborators:
- Beijing Academy of Artificial Intelligence, China
<img src="https://huggingface.co/datasets/BAAI/COIG-PC/resolve/main/assets/baai.png" alt="BAAI" height="100" width="150">
- Peking University, China
<img src="https://huggingface.co/datasets/BAAI/COIG-PC/resolve/main/assets/pku.png" alt="PKU" height="100" width="200">
- The Hong Kong University of Science and Technology (HKUST), China
<img src="https://huggingface.co/datasets/BAAI/COIG-PC/resolve/main/assets/hkust.png" alt="HKUST" height="100" width="200">
- The University of Waterloo, Canada
<img src="https://huggingface.co/datasets/BAAI/COIG-PC/resolve/main/assets/waterloo.png" alt="Waterloo" height="100" width="150">
- The University of Sheffield, United Kingdom
<img src="https://huggingface.co/datasets/BAAI/COIG-PC/resolve/main/assets/sheffield.png" alt="Sheffield" height="100" width="200">
- Beijing University of Posts and Telecommunications, China
<img src="https://huggingface.co/datasets/BAAI/COIG-PC/resolve/main/assets/bupt.png" alt="BUPT" height="100" width="200">
- [Multimodal Art Projection](https://huggingface.co/m-a-p)
<img src="https://huggingface.co/datasets/BAAI/COIG-PC/resolve/main/assets/map.png" alt="M.A.P" height="100" width="200">
- stardust.ai, China
<img src="https://huggingface.co/datasets/BAAI/COIG-PC/resolve/main/assets/stardust.png" alt="stardust.ai" height="100" width="200">
- LinkSoul.AI, China
<img src="https://huggingface.co/datasets/BAAI/COIG-PC/resolve/main/assets/linksoul.png" alt="linksoul.ai" height="100" width="200">
For the detailed list of engineers involved in the creation and refinement of COIG-PC, please refer to the paper that will be published subsequently. This paper will provide in-depth information regarding the contributions and the specifics of the dataset’s development process.
## How to use COIG-PC?
COIG-PC is structured in a **.jsonl** file format. Each line in the file represents a single data record and is structured in JSON (JavaScript Object Notation) format. Below is a breakdown of the elements within each line:
**instruction**: This is a text string that provides the instruction for the task. For example, it might tell the model what to do with the input data.
**input**: This is the input data that the model needs to process. In the context of translation, it would be the text that needs to be translated.
**output**: This contains the expected output data after processing the input. In the context of translation, it would be the translated text.
**split**: Indicates the official split of the original dataset, which is used to categorize data for different phases of model training and evaluation. It can be 'train', 'test', 'valid', etc.
**task_type**: Contains major and minor categories for the dataset. Major categories are broader, while minor categories can be more specific subcategories.
**domain**: Indicates the domain or field to which the data belongs.
**other**: This field can contain additional information or metadata regarding the data record. If there is no additional information, it may be set to null.
### Example
Here is an example of how a line in the COIG-PC dataset might be structured:
```
{
  "instruction": "请把下面的中文句子翻译成英文",
  "input": "我爱你。",
  "output": "I love you.",
  "split": "train",
  "task_type": {
    "major": ["翻译"],
    "minor": ["翻译", "中译英"]
  },
  "domain": ["通用"],
  "other": null
}
```
In this example:
**instruction** tells the model to translate the following Chinese sentence into English.
**input** contains the Chinese text "我爱你" which means "I love you".
**output** contains the expected translation in English: "I love you".
**split** indicates that this data record is part of the training set.
**task_type** specifies that the major category is "Translation" and the minor categories are "Translation" and "Chinese to English".
**domain** specifies that this data record belongs to the general domain.
**other** is set to null as there is no additional information for this data record.
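As a minimal sketch of consuming this format (assuming the raw `.jsonl` files available in the `raw_json` branch, and using only the Python standard library), records like the one above can be parsed line by line; the prompt-assembly step at the end is an illustrative convention, not an official recipe:

```python
import json

def read_coig_pc(path):
    """Yield one record dict per line of a COIG-PC .jsonl file."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)

# The example record above, parsed from a single .jsonl line:
record = json.loads(
    '{"instruction": "请把下面的中文句子翻译成英文", "input": "我爱你。", '
    '"output": "I love you.", "split": "train", '
    '"task_type": {"major": ["翻译"], "minor": ["翻译", "中译英"]}, '
    '"domain": ["通用"], "other": null}'
)
# One common way to assemble the model input from instruction and input:
prompt = record["instruction"] + "\n" + record["input"]
```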
## Update: Oct. 8, 2023
- v1.3: Upload all splits to the main branch as arrow datasets. All jsonl files are stored in the raw_json branch now. Remove 152 task files. Add 10 task files. In total, 275 task files updated.
- v1.2: Delete 31 bad task files. Update 99 task files. Rename 2 task files. Add 3 new task files. COIG-PC now has 3339 tasks in total.
- v1.1: Fix 00040-001-000 and 00050-003-000, ignore 00930 and 01373.
- v1.0: First version for arXiv paper.
- v0.6: Upload 28 new tasks. COIG-PC now has 3367 tasks in total.
- v0.5: Upload 202 new tasks. COIG-PC now has 3339 tasks in total.
- v0.4: Upload 1049 new tasks. COIG-PC now has 3137 tasks in total.
- v0.3: Upload 1139 new tasks. COIG-PC now has 2088 tasks in total.
- v0.2: Upload 422 new tasks. COIG-PC now has 949 tasks in total. Add "TopSamplenumPerTask" split where only "Samplenum" samples are used from each task.
- v0.1: Upload 527 tasks.
## COIG-PC Citation
If you want to cite the COIG-PC dataset, you can use the following:
```
```
## Contact Us
To contact us, feel free to create an issue in this repository.
|
paraloq/json_data_extraction | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- json
- data-extraction
- structured-generation
- restricted-generation
- ecommerce
- medical
- manufacturing
- server logs
- news
pretty_name: Diverse Restricted JSON Data Extraction
size_categories:
- 1K<n<10K
---
# Diverse Restricted JSON Data Extraction
- **Curated by:** The [paraloq analytics](https://www.paraloq.ai) team.
## Uses
1. **Benchmark** restricted JSON data extraction (text + JSON schema -> JSON instance)
2. **Fine-tune** a data extraction model (text + JSON schema -> JSON instance)
3. **Fine-tune** a JSON schema retrieval model (text -> retriever -> most adequate JSON schema)
### Out-of-Scope Use
Intended for research purposes only.
## Dataset Structure
The data comes with the following fields:
- **title**: The title of the schema.
- **topic**: The general topic of the item. For a list of topics, see below.
- **schema**: The JSON schema specifying the structure of the data.
- **item**: A JSON instance of the schema holding actual data.
- **medium**: The medium of the example data. Examples include "news article", "blog post", "email", "html web page", "conversation", etc.
- **text**: An instance of the given medium, containing all the information held by the item, along with additional information.
A focus of this dataset is to provide a diverse set of items from a wide array of topics. We currently include the following topic areas:
- **simple**: Simple, general documents such as to-do lists, calendars, recipes, etc. This is the most generic topic and is designed to be easy to extract.
- **medical**: Medical documents such as patient records, prescriptions, test results, etc.
- **ecommerce**: Ecommerce documents such as product listings, shopping carts, order confirmations, etc.
- **business**: Business documents such as invoices, purchase orders, contracts, etc.
- **travel**: Travel documents such as flight bookings, hotel reservations, itineraries, etc.
- **media**: Media documents such as movie reviews, music albums, video games, etc.
- **technology**: Technology documents such as software licenses, API responses, error logs, etc.
- **manufacturing**: Manufacturing documents such as product BOMs, work orders, inspection reports, COAs etc.
## Dataset Creation
### Curation Rationale
We use this dataset to benchmark different models for their ability to extract data from unstructured text in a zero-shot fashion, by including the desired JSON schema in the prompt.
The dataset can also be used to fine-tune a model to extract data in a zero-shot manner, feeding it text and a target JSON schema. Note that the difficulty here is typically not getting the model output to adhere to the desired JSON schema; that can be enforced by restricting generation using [guidance](https://github.com/guidance-ai/guidance) or [outlines](https://github.com/outlines-dev/outlines). In our experience, the issue is more often that a model does not extract all of the available data.
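For illustration, a minimal sketch of such a completeness check over the `required` top-level properties of a schema; the helper name and the toy schema are our own, not part of the dataset:

```python
import json

def missing_required_fields(schema: dict, instance: dict) -> list:
    """Return the required top-level schema properties absent from an extracted instance."""
    required = schema.get("required", [])
    return [name for name in required if name not in instance]

# A toy schema in the spirit of the dataset's "schema" field:
schema = {
    "type": "object",
    "required": ["product_name", "price", "currency"],
    "properties": {
        "product_name": {"type": "string"},
        "price": {"type": "number"},
        "currency": {"type": "string"},
    },
}

# An extraction that adheres to the schema but is incomplete:
extracted = json.loads('{"product_name": "Espresso Machine", "price": 249.0}')
print(missing_required_fields(schema, extracted))  # -> ['currency']
```

A full validator (e.g. the `jsonschema` package) would additionally check types and nesting; the point here is that schema adherence and extraction completeness are separate concerns.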
### Source Data
This data is synthetically generated using Google's Gemini-Pro.
#### Data Collection and Processing
1. Prompt the model to generate a list of JSON schemas representing a diverse set of items.
2. Prompt the model to create instances from each of the schemas.
3. Prompt the model to generate text (in the form of a blog post, server logs, emails, chats, etc.) that contains the information held by the instance.
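The three steps above can be sketched as follows, with a placeholder `generate` function standing in for calls to Gemini-Pro; the function, prompts, and canned responses are illustrative, not the actual generation code:

```python
import json

def generate(prompt: str) -> str:
    """Placeholder for a Gemini-Pro call; returns canned responses for illustration."""
    if "JSON schema" in prompt:
        return json.dumps({"type": "object", "required": ["title"],
                           "properties": {"title": {"type": "string"}}})
    if "instance" in prompt:
        return json.dumps({"title": "Weekly to-do list"})
    return "Here is my to-do list for the week: ..."

def build_example(topic: str, medium: str) -> dict:
    # 1. Generate a JSON schema for the topic.
    schema = json.loads(generate(f"Write a JSON schema for a {topic} document."))
    # 2. Generate an instance of that schema.
    item = json.loads(generate(f"Create an instance of this schema: {schema}"))
    # 3. Generate text in the given medium containing the instance's information.
    text = generate(f"Write a {medium} containing this data: {item}")
    return {"topic": topic, "medium": medium, "schema": schema, "item": item, "text": text}

example = build_example("simple", "blog post")
```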
#### Who are the source data producers?
paraloq analytics is an Austrian AI research and development company based in Vienna.
## Bias, Risks, and Limitations
The data might include biases resulting from the sampling and bias propagation from Google's Gemini-Pro.
## Dataset Card Authors
Max Arrich
|
peterwz/wiki-filtered-1 | ---
dataset_info:
features:
- name: original
dtype: string
- name: summary
dtype: string
- name: compression_ratio
dtype: string
splits:
- name: train
num_bytes: 8198843
num_examples: 988
download_size: 1195739
dataset_size: 8198843
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
edarchimbaud/revenue-estimate-stocks | ---
dataset_info:
features:
- name: symbol
dtype: string
- name: date
dtype: string
- name: current_qtr
dtype: string
- name: no_of_analysts_current_qtr
dtype: int64
- name: next_qtr
dtype: string
- name: no_of_analysts_next_qtr
dtype: int64
- name: current_year
dtype: int64
- name: no_of_analysts_current_year
dtype: int64
- name: next_year
dtype: int64
- name: no_of_analysts_next_year
dtype: int64
- name: avg_estimate_current_qtr
dtype: string
- name: avg_estimate_next_qtr
dtype: string
- name: avg_estimate_current_year
dtype: string
- name: avg_estimate_next_year
dtype: string
- name: low_estimate_current_qtr
dtype: string
- name: low_estimate_next_qtr
dtype: string
- name: low_estimate_current_year
dtype: string
- name: low_estimate_next_year
dtype: string
- name: high_estimate_current_qtr
dtype: string
- name: high_estimate_next_qtr
dtype: string
- name: high_estimate_current_year
dtype: string
- name: high_estimate_next_year
dtype: string
- name: year_ago_sales_current_qtr
dtype: string
- name: year_ago_sales_next_qtr
dtype: string
- name: year_ago_sales_current_year
dtype: string
- name: year_ago_sales_next_year
dtype: string
- name: sales_growth_yearest_current_qtr
dtype: string
- name: sales_growth_yearest_next_qtr
dtype: string
- name: sales_growth_yearest_current_year
dtype: string
- name: sales_growth_yearest_next_year
dtype: string
splits:
- name: train
num_bytes: 5577663
num_examples: 19712
download_size: 737316
dataset_size: 5577663
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "revenue-estimate-sp500"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://edarchimbaud.substack.com
- **Repository:** https://github.com/edarchimbaud
- **Point of Contact:** contact@edarchimbaud.com
### Dataset Summary
The revenue-estimate-sp500 dataset provides revenue estimate data for companies in the S&P 500 index.
### Supported Tasks and Leaderboards
The dataset can be used to analyze and predict revenue estimates for companies in the S&P 500 index.
## Dataset Structure
### Data Instances
[N/A]
### Data Fields
- symbol (string): A string representing the ticker symbol or abbreviation used to identify the company.
- date (string): A string indicating the date of the recorded data.
- current_qtr (string): A string representing the current quarter.
- no_of_analysts_current_qtr (int64): An integer indicating the number of analysts providing estimates for the current quarter.
- next_qtr (string): A string representing the next quarter.
- no_of_analysts_next_qtr (int64): An integer indicating the number of analysts providing estimates for the next quarter.
- current_year (int64): An integer indicating the current year.
- no_of_analysts_current_year (int64): An integer indicating the number of analysts providing estimates for the current year.
- next_year (int64): An integer indicating the next year.
- no_of_analysts_next_year (int64): An integer indicating the number of analysts providing estimates for the next year.
- avg_estimate_current_qtr (string): A string representing the average estimate for the current quarter.
- avg_estimate_next_qtr (string): A string representing the average estimate for the next quarter.
- avg_estimate_current_year (string): A string representing the average estimate for the current year.
- avg_estimate_next_year (string): A string representing the average estimate for the next year.
- low_estimate_current_qtr (string): A string representing the low estimate for the current quarter.
- low_estimate_next_qtr (string): A string representing the low estimate for the next quarter.
- low_estimate_current_year (string): A string representing the low estimate for the current year.
- low_estimate_next_year (string): A string representing the low estimate for the next year.
- high_estimate_current_qtr (string): A string representing the high estimate for the current quarter.
- high_estimate_next_qtr (string): A string representing the high estimate for the next quarter.
- high_estimate_current_year (string): A string representing the high estimate for the current year.
- high_estimate_next_year (string): A string representing the high estimate for the next year.
- year_ago_sales_current_qtr (string): A string representing the year-ago sales for the current quarter.
- year_ago_sales_next_qtr (string): A string representing the year-ago sales for the next quarter.
- year_ago_sales_current_year (string): A string representing the year-ago sales for the current year.
- year_ago_sales_next_year (string): A string representing the year-ago sales for the next year.
- sales_growth_yearest_current_qtr (string): A string representing the sales growth estimate for the current quarter.
- sales_growth_yearest_next_qtr (string): A string representing the sales growth estimate for the next quarter.
- sales_growth_yearest_current_year (string): A string representing the sales growth estimate for the current year.
- sales_growth_yearest_next_year (string): A string representing the sales growth estimate for the next year.
### Data Splits
A single split, called train.
## Dataset Creation
### Curation Rationale
The revenue-estimate-sp500 dataset was created to provide revenue estimate data for companies in the S&P 500 index.
### Source Data
The data was collected and normalized from reliable sources.
## Additional Information
### Dataset Curators
The revenue-estimate-sp500 dataset was collected by https://edarchimbaud.substack.com.
### Licensing Information
The revenue-estimate-sp500 dataset is licensed under the MIT License.
### Citation Information
> https://edarchimbaud.substack.com, revenue-estimate-sp500 dataset, GitHub repository, https://github.com/edarchimbaud
### Contributions
Thanks to [@edarchimbaud](https://github.com/edarchimbaud) for adding this dataset. |
ylacombe/libritts-r-descriptions-10k-v2 | ---
dataset_info:
- config_name: clean
features:
- name: text
dtype: string
- name: text_original
dtype: string
- name: speaker_id
dtype: string
- name: path
dtype: string
- name: chapter_id
dtype: string
- name: id
dtype: string
- name: speaking_rate
dtype: string
- name: phonemes
dtype: string
- name: snr
dtype: float32
- name: c50
dtype: float32
- name: utterance_pitch_mean
dtype: float32
- name: utterance_pitch_std
dtype: float32
- name: gender
dtype: string
- name: pitch
dtype: string
- name: noise
dtype: string
- name: reverberation
dtype: string
- name: speech_monotony
dtype: string
- name: text_description
dtype: string
splits:
- name: dev.clean
num_bytes: 5077218
num_examples: 5736
- name: test.clean
num_bytes: 4445816
num_examples: 4837
- name: train.clean.100
num_bytes: 29622589
num_examples: 33232
- name: train.clean.360
num_bytes: 104765713
num_examples: 116426
download_size: 50974949
dataset_size: 143911336
- config_name: other
features:
- name: text
dtype: string
- name: text_original
dtype: string
- name: speaker_id
dtype: string
- name: path
dtype: string
- name: chapter_id
dtype: string
- name: id
dtype: string
- name: utterance_pitch_mean
dtype: float32
- name: utterance_pitch_std
dtype: float32
- name: snr
dtype: float64
- name: c50
dtype: float64
- name: speaking_rate
dtype: string
- name: phonemes
dtype: string
- name: gender
dtype: string
- name: pitch
dtype: string
- name: noise
dtype: string
- name: reverberation
dtype: string
- name: speech_monotony
dtype: string
- name: text_description
dtype: string
splits:
- name: dev.other
num_bytes: 3941948
num_examples: 4613
- name: test.other
num_bytes: 4293063
num_examples: 5120
- name: train.other.500
num_bytes: 180086108
num_examples: 205035
download_size: 65220249
dataset_size: 188321119
configs:
- config_name: clean
data_files:
- split: dev.clean
path: clean/dev.clean-*
- split: test.clean
path: clean/test.clean-*
- split: train.clean.100
path: clean/train.clean.100-*
- split: train.clean.360
path: clean/train.clean.360-*
- config_name: other
data_files:
- split: dev.other
path: other/dev.other-*
- split: test.other
path: other/test.other-*
- split: train.other.500
path: other/train.other.500-*
---
|
bcui19/UC-first-turn-raj-tokenizer | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 532857739
num_examples: 207865
download_size: 308545981
dataset_size: 532857739
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "UC-first-turn-raj-tokenizer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigbio/pdr |
---
language:
- en
bigbio_language:
- English
license: unknown
multilinguality: monolingual
bigbio_license_shortname: UNKNOWN
pretty_name: PDR
homepage: http://gcancer.org/pdr/
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- NAMED_ENTITY_RECOGNITION
- EVENT_EXTRACTION
- COREFERENCE_RESOLUTION
---
# Dataset Card for PDR
## Dataset Description
- **Homepage:** http://gcancer.org/pdr/
- **Pubmed:** True
- **Public:** True
- **Tasks:** NER,EE,COREF
The plant-disease relation corpus consists of plant and disease entities and their relations annotated in PubMed abstracts.
The corpus consists of about 2400 plant and disease entities and 300 annotated relations from 179 abstracts.
## Citation Information
```
@article{kim2019corpus,
title={A corpus of plant--disease relations in the biomedical domain},
author={Kim, Baeksoo and Choi, Wonjun and Lee, Hyunju},
journal={PLoS One},
volume={14},
number={8},
pages={e0221582},
year={2019},
publisher={Public Library of Science San Francisco, CA USA}
}
```
|
Sleoruiz/disc_cla_septima | ---
dataset_info:
features:
- name: text
dtype: string
- name: inputs
struct:
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
sequence: string
- name: annotation_agent
dtype: string
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 39746420
num_examples: 9432
download_size: 20745223
dataset_size: 39746420
---
# Dataset Card for "disc_cla_septima"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
apollo-research/roneneldan-TinyStories-tokenizer-gpt2 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 1897521336.0
num_examples: 924718
- name: validation
num_bytes: 19077444.0
num_examples: 9297
download_size: 796489454
dataset_size: 1916598780.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
alvarobartt/gists | ---
license: mit
tags:
- code
size_categories:
- n<1K
---
# Gists
This 🤗 dataset contains some of my GitHub Gists at https://gist.github.com/alvarobartt, ported here so that they are cleaner
and easier to maintain.
## Available gists
* `causallm-to-hub.py`: uploads any `AutoModelForCausalLM` to the 🤗 Hub from a local path. Useful after some LLM fine-tuning,
as `accelerate` sometimes gets stuck while pushing to the Hub, so I tend to do that in a separate process after each epoch has been
dumped to disk.
* `dpo-qlora-4bit.py`: fine-tunes an `AutoModelForCausalLM` using QLoRA in 4-bit. In this case the fine-tuning is done using
🤗 `trl.DPOTrainer`, built on top of `transformers`, useful for intent alignment of LMs with limited resources (~80GB of VRAM). |
llm-aes/toy | ---
dataset_info:
features:
- name: task_id
dtype: string
- name: worker_id
dtype: string
- name: human_label
dtype: int64
- name: llm_label
dtype: int64
- name: generator_1
dtype: string
- name: generator_2
dtype: string
- name: premise
dtype: string
splits:
- name: train
num_bytes: 553962
num_examples: 240
download_size: 21883
dataset_size: 553962
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
aimankem32/races | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_pandego__my-first-blend | ---
pretty_name: Evaluation run of pandego/my-first-blend
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pandego/my-first-blend](https://huggingface.co/pandego/my-first-blend) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pandego__my-first-blend\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T20:15:27.587936](https://huggingface.co/datasets/open-llm-leaderboard/details_pandego__my-first-blend/blob/main/results_2024-04-02T20-15-27.587936.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5418796677743002,\n\
\ \"acc_stderr\": 0.03412034385148196,\n \"acc_norm\": 0.5466283724575167,\n\
\ \"acc_norm_stderr\": 0.03485454406456831,\n \"mc1\": 0.5495716034271726,\n\
\ \"mc1_stderr\": 0.01741726437196764,\n \"mc2\": 0.7070008802879916,\n\
\ \"mc2_stderr\": 0.015219455818404188\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6544368600682594,\n \"acc_stderr\": 0.013896938461145687,\n\
\ \"acc_norm\": 0.6936860068259386,\n \"acc_norm_stderr\": 0.013470584417276514\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6236805417247561,\n\
\ \"acc_stderr\": 0.004834715814208111,\n \"acc_norm\": 0.8303126867157936,\n\
\ \"acc_norm_stderr\": 0.0037459074237767096\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.03988903703336285,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.03988903703336285\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895536,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895536\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.0248708152510571,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.0248708152510571\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n\
\ \"acc_stderr\": 0.027327548447957546,\n \"acc_norm\": 0.6387096774193548,\n\
\ \"acc_norm_stderr\": 0.027327548447957546\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.034139638059062345,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.034139638059062345\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.03895658065271846,\n\
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.03895658065271846\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178815,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.029252823291803624,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.029252823291803624\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5358974358974359,\n \"acc_stderr\": 0.025285585990017848,\n\
\ \"acc_norm\": 0.5358974358974359,\n \"acc_norm_stderr\": 0.025285585990017848\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23333333333333334,\n \"acc_stderr\": 0.02578787422095933,\n \
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.02578787422095933\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.542016806722689,\n \"acc_stderr\": 0.032363611119519416,\n \
\ \"acc_norm\": 0.542016806722689,\n \"acc_norm_stderr\": 0.032363611119519416\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119996,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119996\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7706422018348624,\n \"acc_stderr\": 0.018025349724618684,\n \"\
acc_norm\": 0.7706422018348624,\n \"acc_norm_stderr\": 0.018025349724618684\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074,\n \"acc_norm\"\
: 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.03460228327239172,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.03460228327239172\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.6497890295358649,\n \"acc_stderr\": 0.031052391937584346,\n\
\ \"acc_norm\": 0.6497890295358649,\n \"acc_norm_stderr\": 0.031052391937584346\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262971,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262971\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6012269938650306,\n \"acc_stderr\": 0.038470214204560226,\n\
\ \"acc_norm\": 0.6012269938650306,\n \"acc_norm_stderr\": 0.038470214204560226\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\
\ \"acc_stderr\": 0.02490443909891822,\n \"acc_norm\": 0.8247863247863247,\n\
\ \"acc_norm_stderr\": 0.02490443909891822\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562429,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562429\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7254150702426565,\n\
\ \"acc_stderr\": 0.015959829933084035,\n \"acc_norm\": 0.7254150702426565,\n\
\ \"acc_norm_stderr\": 0.015959829933084035\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5751445086705202,\n \"acc_stderr\": 0.026613350840261743,\n\
\ \"acc_norm\": 0.5751445086705202,\n \"acc_norm_stderr\": 0.026613350840261743\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33743016759776534,\n\
\ \"acc_stderr\": 0.015813901283913048,\n \"acc_norm\": 0.33743016759776534,\n\
\ \"acc_norm_stderr\": 0.015813901283913048\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618874,\n\
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5691318327974276,\n\
\ \"acc_stderr\": 0.028125340983972708,\n \"acc_norm\": 0.5691318327974276,\n\
\ \"acc_norm_stderr\": 0.028125340983972708\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.02766713856942271,\n\
\ \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.02766713856942271\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596157,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596157\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3859191655801825,\n\
\ \"acc_stderr\": 0.01243339891147614,\n \"acc_norm\": 0.3859191655801825,\n\
\ \"acc_norm_stderr\": 0.01243339891147614\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4632352941176471,\n \"acc_stderr\": 0.030290619180485697,\n\
\ \"acc_norm\": 0.4632352941176471,\n \"acc_norm_stderr\": 0.030290619180485697\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5343137254901961,\n \"acc_stderr\": 0.020180144843307293,\n \
\ \"acc_norm\": 0.5343137254901961,\n \"acc_norm_stderr\": 0.020180144843307293\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5755102040816327,\n \"acc_stderr\": 0.031642094879429414,\n\
\ \"acc_norm\": 0.5755102040816327,\n \"acc_norm_stderr\": 0.031642094879429414\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.681592039800995,\n\
\ \"acc_stderr\": 0.03294118479054095,\n \"acc_norm\": 0.681592039800995,\n\
\ \"acc_norm_stderr\": 0.03294118479054095\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5495716034271726,\n\
\ \"mc1_stderr\": 0.01741726437196764,\n \"mc2\": 0.7070008802879916,\n\
\ \"mc2_stderr\": 0.015219455818404188\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235802\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2562547384382108,\n \
\ \"acc_stderr\": 0.012025145867332844\n }\n}\n```"
repo_url: https://huggingface.co/pandego/my-first-blend
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|arc:challenge|25_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|arc:challenge|25_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|gsm8k|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|gsm8k|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hellaswag|10_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hellaswag|10_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-06-48.500018.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-15-27.587936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T20-15-27.587936.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- '**/details_harness|winogrande|5_2024-04-02T20-06-48.500018.parquet'
- split: 2024_04_02T20_15_27.587936
path:
- '**/details_harness|winogrande|5_2024-04-02T20-15-27.587936.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T20-15-27.587936.parquet'
- config_name: results
data_files:
- split: 2024_04_02T20_06_48.500018
path:
- results_2024-04-02T20-06-48.500018.parquet
- split: 2024_04_02T20_15_27.587936
path:
- results_2024-04-02T20-15-27.587936.parquet
- split: latest
path:
- results_2024-04-02T20-15-27.587936.parquet
---
# Dataset Card for Evaluation run of pandego/my-first-blend
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [pandego/my-first-blend](https://huggingface.co/pandego/my-first-blend) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pandego__my-first-blend",
"harness_winogrande_5",
	split="latest")
```
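The timestamped split names above follow a mechanical naming convention: the run timestamp is turned into a valid split identifier by replacing the date dashes and time colons with underscores. A minimal sketch of that mapping (the helper name `timestamp_to_split` is hypothetical, not part of the `datasets` API):

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp such as '2024-04-02T20:15:27.587936'
    into the corresponding split name '2024_04_02T20_15_27.587936'."""
    # Replace the two date dashes, then every time colon, with underscores.
    return ts.replace("-", "_", 2).replace(":", "_")

print(timestamp_to_split("2024-04-02T20:15:27.587936"))
# -> 2024_04_02T20_15_27.587936
```

This is why the split names in the YAML header (e.g. `2024_04_02T20_15_27.587936`) line up one-to-one with the timestamps embedded in the parquet file names.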
## Latest results
These are the [latest results from run 2024-04-02T20:15:27.587936](https://huggingface.co/datasets/open-llm-leaderboard/details_pandego__my-first-blend/blob/main/results_2024-04-02T20-15-27.587936.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.5418796677743002,
"acc_stderr": 0.03412034385148196,
"acc_norm": 0.5466283724575167,
"acc_norm_stderr": 0.03485454406456831,
"mc1": 0.5495716034271726,
"mc1_stderr": 0.01741726437196764,
"mc2": 0.7070008802879916,
"mc2_stderr": 0.015219455818404188
},
"harness|arc:challenge|25": {
"acc": 0.6544368600682594,
"acc_stderr": 0.013896938461145687,
"acc_norm": 0.6936860068259386,
"acc_norm_stderr": 0.013470584417276514
},
"harness|hellaswag|10": {
"acc": 0.6236805417247561,
"acc_stderr": 0.004834715814208111,
"acc_norm": 0.8303126867157936,
"acc_norm_stderr": 0.0037459074237767096
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.03988903703336285,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.03988903703336285
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895536,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895536
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006716,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006716
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.0248708152510571,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.0248708152510571
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.027327548447957546,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.027327548447957546
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.03895658065271846,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.03895658065271846
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.03242497958178815,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.03242497958178815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.029252823291803624,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.029252823291803624
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5358974358974359,
"acc_stderr": 0.025285585990017848,
"acc_norm": 0.5358974358974359,
"acc_norm_stderr": 0.025285585990017848
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.02578787422095933,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.02578787422095933
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.542016806722689,
"acc_stderr": 0.032363611119519416,
"acc_norm": 0.542016806722689,
"acc_norm_stderr": 0.032363611119519416
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119996,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119996
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7706422018348624,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.7706422018348624,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03460228327239172,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03460228327239172
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6497890295358649,
"acc_stderr": 0.031052391937584346,
"acc_norm": 0.6497890295358649,
"acc_norm_stderr": 0.031052391937584346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262971,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262971
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6012269938650306,
"acc_stderr": 0.038470214204560226,
"acc_norm": 0.6012269938650306,
"acc_norm_stderr": 0.038470214204560226
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8247863247863247,
"acc_stderr": 0.02490443909891822,
"acc_norm": 0.8247863247863247,
"acc_norm_stderr": 0.02490443909891822
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562429,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562429
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7254150702426565,
"acc_stderr": 0.015959829933084035,
"acc_norm": 0.7254150702426565,
"acc_norm_stderr": 0.015959829933084035
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5751445086705202,
"acc_stderr": 0.026613350840261743,
"acc_norm": 0.5751445086705202,
"acc_norm_stderr": 0.026613350840261743
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33743016759776534,
"acc_stderr": 0.015813901283913048,
"acc_norm": 0.33743016759776534,
"acc_norm_stderr": 0.015813901283913048
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.028541722692618874,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.028541722692618874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5691318327974276,
"acc_stderr": 0.028125340983972708,
"acc_norm": 0.5691318327974276,
"acc_norm_stderr": 0.028125340983972708
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5524691358024691,
"acc_stderr": 0.02766713856942271,
"acc_norm": 0.5524691358024691,
"acc_norm_stderr": 0.02766713856942271
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596157,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596157
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3859191655801825,
"acc_stderr": 0.01243339891147614,
"acc_norm": 0.3859191655801825,
"acc_norm_stderr": 0.01243339891147614
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4632352941176471,
"acc_stderr": 0.030290619180485697,
"acc_norm": 0.4632352941176471,
"acc_norm_stderr": 0.030290619180485697
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5343137254901961,
"acc_stderr": 0.020180144843307293,
"acc_norm": 0.5343137254901961,
"acc_norm_stderr": 0.020180144843307293
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252089,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252089
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5755102040816327,
"acc_stderr": 0.031642094879429414,
"acc_norm": 0.5755102040816327,
"acc_norm_stderr": 0.031642094879429414
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.681592039800995,
"acc_stderr": 0.03294118479054095,
"acc_norm": 0.681592039800995,
"acc_norm_stderr": 0.03294118479054095
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5495716034271726,
"mc1_stderr": 0.01741726437196764,
"mc2": 0.7070008802879916,
"mc2_stderr": 0.015219455818404188
},
"harness|winogrande|5": {
"acc": 0.7932123125493291,
"acc_stderr": 0.011382566829235802
},
"harness|gsm8k|5": {
"acc": 0.2562547384382108,
"acc_stderr": 0.012025145867332844
}
}
```
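The reported `acc_stderr` values are consistent with the standard error of a sample proportion computed with an `n - 1` denominator. As a sketch (assuming the 100-question size of the `computer_security` subset), this can be checked in a few lines of Python:

```python
import math

def binomial_stderr(acc: float, n: int) -> float:
    """Standard error of a sample proportion, using an (n - 1) denominator."""
    return math.sqrt(acc * (1 - acc) / (n - 1))

# computer_security: acc = 0.68 over an assumed n = 100 questions
stderr = binomial_stderr(0.68, 100)  # ≈ 0.046883, matching the value above
```

Under that assumption, the result reproduces the `0.046882617…` stderr reported for `computer_security` in the block above.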
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
davanstrien/Haiku_Dataset | ---
dataset_info:
features:
- name: haiku
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 11725740
num_examples: 144123
download_size: 7554208
dataset_size: 11725740
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-4.0
task_categories:
- text-generation
language:
- en
pretty_name: Haiku Dataset
---
source: https://www.kaggle.com/datasets/hjhalani30/haiku-dataset |
open-llm-leaderboard/details_bigscience__bloom-1b7 | ---
pretty_name: Evaluation run of bigscience/bloom-1b7
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bigscience/bloom-1b7](https://huggingface.co/bigscience/bloom-1b7) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigscience__bloom-1b7\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-04T13:06:13.491181](https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom-1b7/blob/main/results_2023-12-04T13-06-13.491181.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.008339651250947688,\n\
\ \"acc_stderr\": 0.0025049422268605148\n },\n \"harness|gsm8k|5\"\
: {\n \"acc\": 0.008339651250947688,\n \"acc_stderr\": 0.0025049422268605148\n\
\ }\n}\n```"
repo_url: https://huggingface.co/bigscience/bloom-1b7
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_16T16_35_28.358737
path:
- '**/details_harness|drop|3_2023-10-16T16-35-28.358737.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T16-35-28.358737.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_16T16_35_28.358737
path:
- '**/details_harness|gsm8k|5_2023-10-16T16-35-28.358737.parquet'
- split: 2023_12_03T16_04_08.979472
path:
- '**/details_harness|gsm8k|5_2023-12-03T16-04-08.979472.parquet'
- split: 2023_12_04T09_54_54.675804
path:
- '**/details_harness|gsm8k|5_2023-12-04T09-54-54.675804.parquet'
- split: 2023_12_04T13_06_13.491181
path:
- '**/details_harness|gsm8k|5_2023-12-04T13-06-13.491181.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T13-06-13.491181.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_16T16_35_28.358737
path:
- '**/details_harness|winogrande|5_2023-10-16T16-35-28.358737.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-16T16-35-28.358737.parquet'
- config_name: results
data_files:
- split: 2023_10_16T16_35_28.358737
path:
- results_2023-10-16T16-35-28.358737.parquet
- split: 2023_12_03T16_04_08.979472
path:
- results_2023-12-03T16-04-08.979472.parquet
- split: 2023_12_04T09_54_54.675804
path:
- results_2023-12-04T09-54-54.675804.parquet
- split: 2023_12_04T13_06_13.491181
path:
- results_2023-12-04T13-06-13.491181.parquet
- split: latest
path:
- results_2023-12-04T13-06-13.491181.parquet
---
# Dataset Card for Evaluation run of bigscience/bloom-1b7
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bigscience/bloom-1b7
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bigscience/bloom-1b7](https://huggingface.co/bigscience/bloom-1b7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigscience__bloom-1b7",
"harness_gsm8k_5",
split="train")
```
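Since each run is stored as a split named with its timestamp, a small helper can also pick the most recent run without relying on the `latest` alias (a sketch; the split names below are taken from this card's config list):

```python
from datetime import datetime

def split_timestamp(split_name: str) -> datetime:
    """Parse a run split name like '2023_12_04T13_06_13.491181' into a datetime."""
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

runs = [
    "2023_10_16T16_35_28.358737",
    "2023_12_03T16_04_08.979472",
    "2023_12_04T13_06_13.491181",
]
latest = max(runs, key=split_timestamp)  # the most recent run's split name
```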
## Latest results
These are the [latest results from run 2023-12-04T13:06:13.491181](https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom-1b7/blob/main/results_2023-12-04T13-06-13.491181.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.008339651250947688,
"acc_stderr": 0.0025049422268605148
},
"harness|gsm8k|5": {
"acc": 0.008339651250947688,
"acc_stderr": 0.0025049422268605148
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CorraMcato/SongcheonGPT | ---
license: openrail
---
|
Erynan/gpt_just_10 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response_a
dtype: string
- name: response_b
dtype: string
- name: more_reasonable
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3184
num_examples: 10
download_size: 6058
dataset_size: 3184
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TwoAbove/gpt4v-dataset-test | ---
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
- name: link
dtype: string
- name: message_id
dtype: string
- name: timestamp
dtype: string
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 0
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Use the Edit dataset card button to edit. |
0x-YuAN/voice_dataset | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: ID
dtype: string
- name: Sex
dtype: int64
- name: Age
dtype: int64
- name: Disease category
dtype: int64
- name: Narrow pitch range
dtype: int64
- name: Decreased volume
dtype: int64
- name: Fatigue
dtype: int64
- name: Dryness
dtype: int64
- name: Lumping
dtype: int64
- name: heartburn
dtype: int64
- name: Choking
dtype: int64
- name: Eye dryness
dtype: int64
- name: PND
dtype: int64
- name: Smoking
dtype: int64
- name: PPD
dtype: float64
- name: Drinking
dtype: int64
- name: frequency
dtype: int64
- name: Diurnal pattern
dtype: int64
- name: 'Onset of dysphonia '
dtype: int64
- name: Noise at work
dtype: int64
- name: Occupational vocal demand
dtype: int64
- name: Diabetes
dtype: int64
- name: Hypertension
dtype: int64
- name: CAD
dtype: int64
- name: Head and Neck Cancer
dtype: int64
- name: Head injury
dtype: int64
- name: CVA
dtype: int64
- name: Voice handicap index - 10
dtype: float64
splits:
- name: train
num_bytes: 340418666.0
num_examples: 1000
download_size: 323237441
dataset_size: 340418666.0
---
# Dataset Card for "voice_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yardeny/processed_gpt2_context_len_512 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 15593335128.0
num_examples: 6072171
download_size: 6562663671
dataset_size: 15593335128.0
---
# Dataset Card for "processed_gpt2_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
myrtotsok/clf-noSbI-fixGQA | ---
dataset_info:
features:
- name: request
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 76547
num_examples: 960
- name: validation
num_bytes: 19144
num_examples: 240
download_size: 25721
dataset_size: 95691
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_openchat__openchat_3.5 | ---
pretty_name: Evaluation run of openchat/openchat_3.5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openchat/openchat_3.5](https://huggingface.co/openchat/openchat_3.5) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openchat__openchat_3.5_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-19T10:30:18.054013](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_3.5_public/blob/main/results_2023-11-19T10-30-18.054013.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6253361427748827,\n\
\ \"acc_stderr\": 0.03243199538325514,\n \"acc_norm\": 0.6324168865850391,\n\
\ \"acc_norm_stderr\": 0.033117338974973515,\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133036,\n \"mc2\": 0.4543017595862846,\n\
\ \"mc2_stderr\": 0.015109332514210328,\n \"em\": 0.0026216442953020135,\n\
\ \"em_stderr\": 0.0005236685642965895,\n \"f1\": 0.0692680369127516,\n\
\ \"f1_stderr\": 0.0014684205896877763\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5836177474402731,\n \"acc_stderr\": 0.014405618279436174,\n\
\ \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.014150631435111728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6450906193985262,\n\
\ \"acc_stderr\": 0.0047750796365670966,\n \"acc_norm\": 0.839573790081657,\n\
\ \"acc_norm_stderr\": 0.003662508272330902\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n \
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.03745554791462455,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.03745554791462455\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n\
\ \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n\
\ \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709447,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709447\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.02446861524147893,\n \
\ \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.02446861524147893\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.02956070739246572,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.02956070739246572\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572206,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572206\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709695,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709695\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489274,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489274\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3486033519553073,\n\
\ \"acc_stderr\": 0.01593748465668703,\n \"acc_norm\": 0.3486033519553073,\n\
\ \"acc_norm_stderr\": 0.01593748465668703\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906504,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906504\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398357,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398357\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6405228758169934,\n \"acc_stderr\": 0.01941253924203216,\n \
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.01941253924203216\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.029393609319879804,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.029393609319879804\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.027686913588013014,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.027686913588013014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133036,\n \"mc2\": 0.4543017595862846,\n\
\ \"mc2_stderr\": 0.015109332514210328\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8105761641673244,\n \"acc_stderr\": 0.011012790432989243\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0026216442953020135,\n \
\ \"em_stderr\": 0.0005236685642965895,\n \"f1\": 0.0692680369127516,\n\
\ \"f1_stderr\": 0.0014684205896877763\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.2577710386656558,\n \"acc_stderr\": 0.012048370213576602\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openchat/openchat_3.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|arc:challenge|25_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|arc:challenge|25_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|arc:challenge|25_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|drop|3_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|drop|3_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|drop|3_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|gsm8k|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|gsm8k|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|gsm8k|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hellaswag|10_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hellaswag|10_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hellaswag|10_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T16-15-03.792286.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T16-22-29.903207.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T10-30-18.054013.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-19T10-30-18.054013.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- '**/details_harness|winogrande|5_2023-11-18T16-15-03.792286.parquet'
- split: 2023_11_18T16_22_29.903207
path:
- '**/details_harness|winogrande|5_2023-11-18T16-22-29.903207.parquet'
- split: 2023_11_19T10_30_18.054013
path:
- '**/details_harness|winogrande|5_2023-11-19T10-30-18.054013.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-19T10-30-18.054013.parquet'
- config_name: results
data_files:
- split: 2023_11_18T16_15_03.792286
path:
- results_2023-11-18T16-15-03.792286.parquet
- split: 2023_11_18T16_22_29.903207
path:
- results_2023-11-18T16-22-29.903207.parquet
- split: 2023_11_19T10_30_18.054013
path:
- results_2023-11-19T10-30-18.054013.parquet
- split: latest
path:
- results_2023-11-19T10-30-18.054013.parquet
---
# Dataset Card for Evaluation run of openchat/openchat_3.5
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openchat/openchat_3.5
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openchat/openchat_3.5](https://huggingface.co/openchat/openchat_3.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_3.5_public",
"harness_winogrande_5",
	split="latest")
```
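Because the timestamped split names use a lexicographically sortable pattern (`YYYY_MM_DDTHH_MM_SS.ffffff`), the run that the "latest" split mirrors can also be recovered by taking the maximum of the timestamped names. A minimal sketch, using the three split names that appear in this card:

```python
# Split names sort chronologically because the timestamp fields are
# zero-padded and ordered from most to least significant.
splits = [
    "2023_11_18T16_15_03.792286",
    "2023_11_18T16_22_29.903207",
    "2023_11_19T10_30_18.054013",
]

# The "latest" split in each configuration points at the same files as
# the chronologically last timestamped split.
latest = max(splits)  # -> "2023_11_19T10_30_18.054013"
```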
## Latest results
These are the [latest results from run 2023-11-19T10:30:18.054013](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_3.5_public/blob/main/results_2023-11-19T10-30-18.054013.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6253361427748827,
"acc_stderr": 0.03243199538325514,
"acc_norm": 0.6324168865850391,
"acc_norm_stderr": 0.033117338974973515,
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133036,
"mc2": 0.4543017595862846,
"mc2_stderr": 0.015109332514210328,
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965895,
"f1": 0.0692680369127516,
"f1_stderr": 0.0014684205896877763
},
"harness|arc:challenge|25": {
"acc": 0.5836177474402731,
"acc_stderr": 0.014405618279436174,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.014150631435111728
},
"harness|hellaswag|10": {
"acc": 0.6450906193985262,
"acc_stderr": 0.0047750796365670966,
"acc_norm": 0.839573790081657,
"acc_norm_stderr": 0.003662508272330902
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.02845015479411864,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.02845015479411864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462455,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709447,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709447
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.02446861524147893,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.02446861524147893
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.02956070739246572,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.02956070739246572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579647,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579647
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572206,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572206
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709695,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709695
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489274,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489274
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3486033519553073,
"acc_stderr": 0.01593748465668703,
"acc_norm": 0.3486033519553073,
"acc_norm_stderr": 0.01593748465668703
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906504,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906504
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140446,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140446
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.02873932851398357,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.02873932851398357
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.01941253924203216,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.01941253924203216
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.029393609319879804,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.029393609319879804
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013014,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013014
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835816,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835816
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133036,
"mc2": 0.4543017595862846,
"mc2_stderr": 0.015109332514210328
},
"harness|winogrande|5": {
"acc": 0.8105761641673244,
"acc_stderr": 0.011012790432989243
},
"harness|drop|3": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965895,
"f1": 0.0692680369127516,
"f1_stderr": 0.0014684205896877763
},
"harness|gsm8k|5": {
"acc": 0.2577710386656558,
"acc_stderr": 0.012048370213576602
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
rjaiswal/friends-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 217527103.0
num_examples: 30
download_size: 217511845
dataset_size: 217527103.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "friends-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/art556_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of art556/ART556/ART556 (Girls' Frontline)
This is the dataset of art556/ART556/ART556 (Girls' Frontline), containing 49 images and their tags.
The core tags of this character are `animal_ears, animal_ear_fluff, bow, green_hair, long_hair, hair_bow, hair_between_eyes, bangs, twintails, brown_eyes, green_bow, very_long_hair, breasts, small_breasts, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 49 | 61.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/art556_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 49 | 35.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/art556_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 125 | 81.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/art556_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 49 | 55.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/art556_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 125 | 115.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/art556_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
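Any of the packages above can be fetched the same way as the raw archive below. A minimal sketch — the `package_filename` and `download_package` helpers are ours, with the `dataset-<name>.zip` naming inferred from the download links in the table:

```python
import os
import zipfile


def package_filename(name: str) -> str:
    # Archive names in this repo follow the "dataset-<name>.zip" pattern
    # visible in the table above (e.g. "dataset-800.zip").
    return f"dataset-{name}.zip"


def download_package(name: str, dest_dir: str) -> str:
    """Download and extract one package (e.g. '800', 'stage3-p480-800')."""
    # Lazy import: only needed when actually downloading.
    from huggingface_hub import hf_hub_download

    zip_file = hf_hub_download(
        repo_id='CyberHarem/art556_girlsfrontline',
        repo_type='dataset',
        filename=package_filename(name),
    )
    os.makedirs(dest_dir, exist_ok=True)
    with zipfile.ZipFile(zip_file, 'r') as zf:
        zf.extractall(dest_dir)
    return dest_dir
```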
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/art556_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | bare_shoulders, crop_top, smile, white_gloves, 1girl, blush, closed_mouth, green_skirt, looking_at_viewer, pleated_skirt, sleeveless_shirt, solo, white_shirt, midriff, suspenders, white_thighhighs, navel, v-shaped_eyebrows, :q, white_footwear, collared_shirt, white_background, black_panties, hand_up, sitting, thigh_boots |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | bare_shoulders | crop_top | smile | white_gloves | 1girl | blush | closed_mouth | green_skirt | looking_at_viewer | pleated_skirt | sleeveless_shirt | solo | white_shirt | midriff | suspenders | white_thighhighs | navel | v-shaped_eyebrows | :q | white_footwear | collared_shirt | white_background | black_panties | hand_up | sitting | thigh_boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------|:-----------|:--------|:---------------|:--------|:--------|:---------------|:--------------|:--------------------|:----------------|:-------------------|:-------|:--------------|:----------|:-------------|:-------------------|:--------|:--------------------|:-----|:-----------------|:-----------------|:-------------------|:----------------|:----------|:----------|:--------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_cerebras__Cerebras-GPT-6.7B | ---
pretty_name: Evaluation run of cerebras/Cerebras-GPT-6.7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cerebras/Cerebras-GPT-6.7B](https://huggingface.co/cerebras/Cerebras-GPT-6.7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cerebras__Cerebras-GPT-6.7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T00:38:58.365291](https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-6.7B/blob/main/results_2023-10-15T00-38-58.365291.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n\
\ \"em_stderr\": 0.00029649629898012217,\n \"f1\": 0.047345847315436396,\n\
\ \"f1_stderr\": 0.0011636776448840373,\n \"acc\": 0.296260470938676,\n\
\ \"acc_stderr\": 0.007919183184815076\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.00029649629898012217,\n\
\ \"f1\": 0.047345847315436396,\n \"f1_stderr\": 0.0011636776448840373\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.0020013057209480483\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5872138910812944,\n \"acc_stderr\": 0.013837060648682103\n\
\ }\n}\n```"
repo_url: https://huggingface.co/cerebras/Cerebras-GPT-6.7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T00_38_58.365291
path:
- '**/details_harness|drop|3_2023-10-15T00-38-58.365291.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T00-38-58.365291.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T00_38_58.365291
path:
- '**/details_harness|gsm8k|5_2023-10-15T00-38-58.365291.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T00-38-58.365291.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:33:57.181673.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:33:57.181673.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:33:57.181673.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T00_38_58.365291
path:
- '**/details_harness|winogrande|5_2023-10-15T00-38-58.365291.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T00-38-58.365291.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_33_57.181673
path:
- results_2023-07-19T16:33:57.181673.parquet
- split: 2023_10_15T00_38_58.365291
path:
- results_2023-10-15T00-38-58.365291.parquet
- split: latest
path:
- results_2023-10-15T00-38-58.365291.parquet
---
# Dataset Card for Evaluation run of cerebras/Cerebras-GPT-6.7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/cerebras/Cerebras-GPT-6.7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [cerebras/Cerebras-GPT-6.7B](https://huggingface.co/cerebras/Cerebras-GPT-6.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cerebras__Cerebras-GPT-6.7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-15T00:38:58.365291](https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-6.7B/blob/main/results_2023-10-15T00-38-58.365291.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0008389261744966443,
"em_stderr": 0.00029649629898012217,
"f1": 0.047345847315436396,
"f1_stderr": 0.0011636776448840373,
"acc": 0.296260470938676,
"acc_stderr": 0.007919183184815076
},
"harness|drop|3": {
"em": 0.0008389261744966443,
"em_stderr": 0.00029649629898012217,
"f1": 0.047345847315436396,
"f1_stderr": 0.0011636776448840373
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.0020013057209480483
},
"harness|winogrande|5": {
"acc": 0.5872138910812944,
"acc_stderr": 0.013837060648682103
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Hack90/ref_seq_protozoa | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
splits:
- name: train
num_bytes: 4181483666
num_examples: 170662
download_size: 1791079199
dataset_size: 4181483666
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
haoranxu/ALMA-Human-Parallel | ---
dataset_info:
- config_name: cs-en
features:
- name: translation
struct:
- name: cs
dtype: string
- name: en
dtype: string
splits:
- name: train
num_bytes: 3432181
num_examples: 12076
- name: validation
num_bytes: 318813
num_examples: 1002
download_size: 0
dataset_size: 3750994
- config_name: de-en
features:
- name: translation
struct:
- name: de
dtype: string
- name: en
dtype: string
splits:
- name: train
num_bytes: 4108729
num_examples: 14211
- name: validation
num_bytes: 329855
num_examples: 1002
download_size: 0
dataset_size: 4438584
- config_name: is-en
features:
- name: translation
struct:
- name: is
dtype: string
- name: en
dtype: string
splits:
- name: train
num_bytes: 554190
num_examples: 2009
download_size: 0
dataset_size: 554190
- config_name: ru-en
features:
- name: translation
struct:
- name: ru
dtype: string
- name: en
dtype: string
splits:
- name: train
num_bytes: 5427552
num_examples: 15000
- name: validation
num_bytes: 442271
num_examples: 1002
download_size: 0
dataset_size: 5869823
- config_name: zh-en
features:
- name: translation
struct:
- name: zh
dtype: string
- name: en
dtype: string
splits:
- name: train
num_bytes: 4700299
num_examples: 15406
- name: validation
num_bytes: 285969
num_examples: 1002
download_size: 0
dataset_size: 4986268
configs:
- config_name: cs-en
data_files:
- split: train
path: cs-en/train-*
- split: validation
path: cs-en/validation-*
- config_name: de-en
data_files:
- split: train
path: de-en/train-*
- split: validation
path: de-en/validation-*
- config_name: is-en
data_files:
- split: train
path: is-en/train-*
- config_name: ru-en
data_files:
- split: train
path: ru-en/train-*
- split: validation
path: ru-en/validation-*
- config_name: zh-en
data_files:
- split: train
path: zh-en/train-*
- split: validation
path: zh-en/validation-*
---
# Dataset Card for "ALMA-Human-Parallel"
This is the human-written parallel dataset used by the [ALMA](https://arxiv.org/abs/2309.11674) translation models.
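Per the YAML schema above, each example nests both sides of a pair under a `translation` struct keyed by language code. A minimal sketch of unpacking one record, using a hypothetical in-memory example rather than a Hub download (the `unpack` helper is illustrative, not part of the dataset's API):

```python
# Hypothetical record shaped like the card's `translation` struct schema.
record = {"translation": {"de": "Guten Morgen.", "en": "Good morning."}}

def unpack(example, src_lang, tgt_lang):
    """Return (source, target) strings from a nested translation record."""
    t = example["translation"]
    return t[src_lang], t[tgt_lang]

src, tgt = unpack(record, "de", "en")
print(f"{src} -> {tgt}")  # Guten Morgen. -> Good morning.
```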
```
@misc{xu2023paradigm,
title={A Paradigm Shift in Machine Translation: Boosting Translation Performance of Large Language Models},
author={Haoran Xu and Young Jin Kim and Amr Sharaf and Hany Hassan Awadalla},
year={2023},
eprint={2309.11674},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```
@misc{xu2024contrastive,
title={Contrastive Preference Optimization: Pushing the Boundaries of LLM Performance in Machine Translation},
author={Haoran Xu and Amr Sharaf and Yunmo Chen and Weiting Tan and Lingfeng Shen and Benjamin Van Durme and Kenton Murray and Young Jin Kim},
year={2024},
eprint={2401.08417},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
HaiboinLeeds/eee3_bug | ---
license: mit
---
|
eunbinni/ola_polyglot_5.8B_t2_data | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 29998082
num_examples: 107174
download_size: 18601058
dataset_size: 29998082
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ola_polyglot_5.8B_t2_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Divyan-shu-Singh/mini-platypus-two | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
clue | ---
annotations_creators:
- other
language_creators:
- other
language:
- zh
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-classification
- multiple-choice
task_ids:
- topic-classification
- semantic-similarity-scoring
- natural-language-inference
- multiple-choice-qa
paperswithcode_id: clue
pretty_name: 'CLUE: Chinese Language Understanding Evaluation benchmark'
tags:
- coreference-nli
- qa-nli
dataset_info:
- config_name: afqmc
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 378718
num_examples: 3861
- name: train
num_bytes: 3396503
num_examples: 34334
- name: validation
num_bytes: 426285
num_examples: 4316
download_size: 2337418
dataset_size: 4201506
- config_name: c3
features:
- name: id
dtype: int32
- name: context
sequence: string
- name: question
dtype: string
- name: choice
sequence: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1600142
num_examples: 1625
- name: train
num_bytes: 9672739
num_examples: 11869
- name: validation
num_bytes: 2990943
num_examples: 3816
download_size: 4718960
dataset_size: 14263824
- config_name: chid
features:
- name: idx
dtype: int32
- name: candidates
sequence: string
- name: content
sequence: string
- name: answers
sequence:
- name: text
dtype: string
- name: candidate_id
dtype: int32
splits:
- name: test
num_bytes: 11480435
num_examples: 3447
- name: train
num_bytes: 252477926
num_examples: 84709
- name: validation
num_bytes: 10117761
num_examples: 3218
download_size: 198468807
dataset_size: 274076122
- config_name: cluewsc2020
features:
- name: idx
dtype: int32
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'true'
'1': 'false'
- name: target
struct:
- name: span1_text
dtype: string
- name: span2_text
dtype: string
- name: span1_index
dtype: int32
- name: span2_index
dtype: int32
splits:
- name: test
num_bytes: 645637
num_examples: 2574
- name: train
num_bytes: 288816
num_examples: 1244
- name: validation
num_bytes: 72670
num_examples: 304
download_size: 380611
dataset_size: 1007123
- config_name: cmnli
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype:
class_label:
names:
'0': neutral
'1': entailment
'2': contradiction
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 2386821
num_examples: 13880
- name: train
num_bytes: 67684989
num_examples: 391783
- name: validation
num_bytes: 2051829
num_examples: 12241
download_size: 54234919
dataset_size: 72123639
- config_name: cmrc2018
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: test
num_bytes: 3112042
num_examples: 2000
- name: train
num_bytes: 15508062
num_examples: 10142
- name: validation
num_bytes: 5183785
num_examples: 3219
- name: trial
num_bytes: 1606907
num_examples: 1002
download_size: 5459001
dataset_size: 25410796
- config_name: csl
features:
- name: idx
dtype: int32
- name: corpus_id
dtype: int32
- name: abst
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: keyword
sequence: string
splits:
- name: test
num_bytes: 2463728
num_examples: 3000
- name: train
num_bytes: 16478890
num_examples: 20000
- name: validation
num_bytes: 2464563
num_examples: 3000
download_size: 3936111
dataset_size: 21407181
- config_name: diagnostics
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype:
class_label:
names:
'0': neutral
'1': entailment
'2': contradiction
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 42392
num_examples: 514
download_size: 23000
dataset_size: 42392
- config_name: drcd
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: test
num_bytes: 4982378
num_examples: 3493
- name: train
num_bytes: 37443386
num_examples: 26936
- name: validation
num_bytes: 5222729
num_examples: 3524
download_size: 11188875
dataset_size: 47648493
- config_name: iflytek
features:
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
'7': '7'
'8': '8'
'9': '9'
'10': '10'
'11': '11'
'12': '12'
'13': '13'
'14': '14'
'15': '15'
'16': '16'
'17': '17'
'18': '18'
'19': '19'
'20': '20'
'21': '21'
'22': '22'
'23': '23'
'24': '24'
'25': '25'
'26': '26'
'27': '27'
'28': '28'
'29': '29'
'30': '30'
'31': '31'
'32': '32'
'33': '33'
'34': '34'
'35': '35'
'36': '36'
'37': '37'
'38': '38'
'39': '39'
'40': '40'
'41': '41'
'42': '42'
'43': '43'
'44': '44'
'45': '45'
'46': '46'
'47': '47'
'48': '48'
'49': '49'
'50': '50'
'51': '51'
'52': '52'
'53': '53'
'54': '54'
'55': '55'
'56': '56'
'57': '57'
'58': '58'
'59': '59'
'60': '60'
'61': '61'
'62': '62'
'63': '63'
'64': '64'
'65': '65'
'66': '66'
'67': '67'
'68': '68'
'69': '69'
'70': '70'
'71': '71'
'72': '72'
'73': '73'
'74': '74'
'75': '75'
'76': '76'
'77': '77'
'78': '78'
'79': '79'
'80': '80'
'81': '81'
'82': '82'
'83': '83'
'84': '84'
'85': '85'
'86': '86'
'87': '87'
'88': '88'
'89': '89'
'90': '90'
'91': '91'
'92': '92'
'93': '93'
'94': '94'
'95': '95'
'96': '96'
'97': '97'
'98': '98'
'99': '99'
'100': '100'
'101': '101'
'102': '102'
'103': '103'
'104': '104'
'105': '105'
'106': '106'
'107': '107'
'108': '108'
'109': '109'
'110': '110'
'111': '111'
'112': '112'
'113': '113'
'114': '114'
'115': '115'
'116': '116'
'117': '117'
'118': '118'
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 2105684
num_examples: 2600
- name: train
num_bytes: 10028605
num_examples: 12133
- name: validation
num_bytes: 2157119
num_examples: 2599
download_size: 9777855
dataset_size: 14291408
- config_name: ocnli
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype:
class_label:
names:
'0': neutral
'1': entailment
'2': contradiction
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 376058
num_examples: 3000
- name: train
num_bytes: 6187142
num_examples: 50437
- name: validation
num_bytes: 366227
num_examples: 2950
download_size: 3000218
dataset_size: 6929427
- config_name: tnews
features:
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': '100'
'1': '101'
'2': '102'
'3': '103'
'4': '104'
'5': '106'
'6': '107'
'7': '108'
'8': '109'
'9': '110'
'10': '112'
'11': '113'
'12': '114'
'13': '115'
'14': '116'
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 810970
num_examples: 10000
- name: train
num_bytes: 4245677
num_examples: 53360
- name: validation
num_bytes: 797922
num_examples: 10000
download_size: 4697843
dataset_size: 5854569
configs:
- config_name: afqmc
data_files:
- split: test
path: afqmc/test-*
- split: train
path: afqmc/train-*
- split: validation
path: afqmc/validation-*
- config_name: c3
data_files:
- split: test
path: c3/test-*
- split: train
path: c3/train-*
- split: validation
path: c3/validation-*
- config_name: chid
data_files:
- split: test
path: chid/test-*
- split: train
path: chid/train-*
- split: validation
path: chid/validation-*
- config_name: cluewsc2020
data_files:
- split: test
path: cluewsc2020/test-*
- split: train
path: cluewsc2020/train-*
- split: validation
path: cluewsc2020/validation-*
- config_name: cmnli
data_files:
- split: test
path: cmnli/test-*
- split: train
path: cmnli/train-*
- split: validation
path: cmnli/validation-*
- config_name: cmrc2018
data_files:
- split: test
path: cmrc2018/test-*
- split: train
path: cmrc2018/train-*
- split: validation
path: cmrc2018/validation-*
- split: trial
path: cmrc2018/trial-*
- config_name: csl
data_files:
- split: test
path: csl/test-*
- split: train
path: csl/train-*
- split: validation
path: csl/validation-*
- config_name: diagnostics
data_files:
- split: test
path: diagnostics/test-*
- config_name: drcd
data_files:
- split: test
path: drcd/test-*
- split: train
path: drcd/train-*
- split: validation
path: drcd/validation-*
- config_name: iflytek
data_files:
- split: test
path: iflytek/test-*
- split: train
path: iflytek/train-*
- split: validation
path: iflytek/validation-*
- config_name: ocnli
data_files:
- split: test
path: ocnli/test-*
- split: train
path: ocnli/train-*
- split: validation
path: ocnli/validation-*
- config_name: tnews
data_files:
- split: test
path: tnews/test-*
- split: train
path: tnews/train-*
- split: validation
path: tnews/validation-*
---
# Dataset Card for "clue"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.cluebenchmarks.com
- **Repository:** https://github.com/CLUEbenchmark/CLUE
- **Paper:** [CLUE: A Chinese Language Understanding Evaluation Benchmark](https://aclanthology.org/2020.coling-main.419/) (preprint: [arXiv:2004.05986](https://arxiv.org/abs/2004.05986))
- **Point of Contact:** [Zhenzhong Lan](mailto:lanzhenzhong@westlake.edu.cn)
- **Size of downloaded dataset files:** 198.68 MB
- **Size of the generated dataset:** 486.34 MB
- **Total amount of disk used:** 685.02 MB
### Dataset Summary
CLUE (Chinese Language Understanding Evaluation Benchmark,
https://www.cluebenchmarks.com/) is a collection of resources for training,
evaluating, and analyzing Chinese language understanding systems.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### afqmc
- **Size of downloaded dataset files:** 1.20 MB
- **Size of the generated dataset:** 4.20 MB
- **Total amount of disk used:** 5.40 MB
An example of 'validation' looks as follows.
```
{
"idx": 0,
"label": 0,
"sentence1": "双十一花呗提额在哪",
"sentence2": "里可以提花呗额度"
}
```
#### c3
- **Size of downloaded dataset files:** 3.20 MB
- **Size of the generated dataset:** 15.69 MB
- **Total amount of disk used:** 18.90 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"answer": "比人的灵敏",
"choice": ["没有人的灵敏", "和人的差不多", "和人的一样好", "比人的灵敏"],
"context": "[\"许多动物的某些器官感觉特别灵敏,它们能比人类提前知道一些灾害事件的发生,例如,海洋中的水母能预报风暴,老鼠能事先躲避矿井崩塌或有害气体,等等。地震往往能使一些动物的某些感觉器官受到刺激而发生异常反应。如一个地区的重力发生变异,某些动物可能通过它们的平衡...",
"id": 1,
"question": "动物的器官感觉与人的相比有什么不同?"
}
```
#### chid
- **Size of downloaded dataset files:** 139.20 MB
- **Size of the generated dataset:** 274.08 MB
- **Total amount of disk used:** 413.28 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"answers": {
"candidate_id": [3, 5, 6, 1, 7, 4, 0],
"text": ["碌碌无为", "无所作为", "苦口婆心", "得过且过", "未雨绸缪", "软硬兼施", "传宗接代"]
},
"candidates": "[\"传宗接代\", \"得过且过\", \"咄咄逼人\", \"碌碌无为\", \"软硬兼施\", \"无所作为\", \"苦口婆心\", \"未雨绸缪\", \"和衷共济\", \"人老珠黄\"]...",
"content": "[\"谈到巴萨目前的成就,瓜迪奥拉用了“坚持”两个字来形容。自从上世纪90年代克鲁伊夫带队以来,巴萨就坚持每年都有拉玛西亚球员进入一队的传统。即便是范加尔时代,巴萨强力推出的“巴萨五鹰”德拉·佩纳、哈维、莫雷罗、罗杰·加西亚和贝拉乌桑几乎#idiom0000...",
"idx": 0
}
```
#### cluewsc2020
- **Size of downloaded dataset files:** 0.28 MB
- **Size of the generated dataset:** 1.03 MB
- **Total amount of disk used:** 1.29 MB
An example of 'train' looks as follows.
```
{
"idx": 0,
"label": 1,
"target": {
"span1_index": 3,
"span1_text": "伤口",
"span2_index": 27,
"span2_text": "它们"
},
"text": "裂开的伤口涂满尘土,里面有碎石子和木头刺,我小心翼翼把它们剔除出去。"
}
```
#### cmnli
- **Size of downloaded dataset files:** 31.40 MB
- **Size of the generated dataset:** 72.12 MB
- **Total amount of disk used:** 103.53 MB
An example of 'train' looks as follows.
```
{
"idx": 0,
"label": 0,
"sentence1": "从概念上讲,奶油略读有两个基本维度-产品和地理。",
"sentence2": "产品和地理位置是使奶油撇油起作用的原因。"
}
```
### Data Fields
The data fields are the same among all splits.
#### afqmc
- `sentence1`: a `string` feature.
- `sentence2`: a `string` feature.
- `label`: a classification label, with possible values including `0` (0), `1` (1).
- `idx`: a `int32` feature.
#### c3
- `id`: a `int32` feature.
- `context`: a `list` of `string` features.
- `question`: a `string` feature.
- `choice`: a `list` of `string` features.
- `answer`: a `string` feature.
#### chid
- `idx`: a `int32` feature.
- `candidates`: a `list` of `string` features.
- `content`: a `list` of `string` features.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `candidate_id`: a `int32` feature.
#### cluewsc2020
- `idx`: a `int32` feature.
- `text`: a `string` feature.
- `label`: a classification label, with possible values including `true` (0), `false` (1).
- `span1_text`: a `string` feature.
- `span2_text`: a `string` feature.
- `span1_index`: a `int32` feature.
- `span2_index`: a `int32` feature.
#### cmnli
- `sentence1`: a `string` feature.
- `sentence2`: a `string` feature.
- `label`: a classification label, with possible values including `neutral` (0), `entailment` (1), `contradiction` (2).
- `idx`: a `int32` feature.
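The `label` fields above are stored as integers. To make them readable you can invert the mapping yourself; below is a minimal plain-Python sketch for the cmnli config that mirrors what `datasets.ClassLabel.int2str`/`str2int` would return (the label order is taken from this card):

```python
# Label names for cmnli, in the order declared in this card:
# 0 = neutral, 1 = entailment, 2 = contradiction
names = ["neutral", "entailment", "contradiction"]

int2str = dict(enumerate(names))               # id -> name
str2int = {n: i for i, n in enumerate(names)}  # name -> id

print(int2str[1])                # entailment
print(str2int["contradiction"])  # 2
```

The same pattern applies to any config here with a `class_label` feature (e.g. ocnli uses the identical three-way mapping).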
### Data Splits
| name |train |validation|test |
|-----------|-----:|---------:|----:|
|afqmc | 34334| 4316| 3861|
|c3 | 11869| 3816| 3892|
|chid | 84709| 3218| 3231|
|cluewsc2020| 1244| 304| 290|
|cmnli |391783| 12241|13880|
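As a quick sanity check, the per-config totals implied by the table can be tallied in a few lines. The numbers are copied from the table above; note this covers only the five configs listed here, not the full benchmark:

```python
# (train, validation, test) example counts from the table above
splits = {
    "afqmc":       (34334,  4316,  3861),
    "c3":          (11869,  3816,  3892),
    "chid":        (84709,  3218,  3231),
    "cluewsc2020": (1244,   304,   290),
    "cmnli":       (391783, 12241, 13880),
}

per_config = {name: sum(counts) for name, counts in splits.items()}
print(per_config["cmnli"])       # 417904
print(sum(per_config.values()))  # 572988
```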
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{xu-etal-2020-clue,
title = "{CLUE}: A {C}hinese Language Understanding Evaluation Benchmark",
author = "Xu, Liang and
Hu, Hai and
Zhang, Xuanwei and
Li, Lu and
Cao, Chenjie and
Li, Yudong and
Xu, Yechen and
Sun, Kai and
Yu, Dian and
Yu, Cong and
Tian, Yin and
Dong, Qianqian and
Liu, Weitang and
Shi, Bo and
Cui, Yiming and
Li, Junyi and
Zeng, Jun and
Wang, Rongzhao and
Xie, Weijian and
Li, Yanting and
Patterson, Yina and
Tian, Zuoyu and
Zhang, Yiwen and
Zhou, He and
Liu, Shaoweihua and
Zhao, Zhe and
Zhao, Qipeng and
Yue, Cong and
Zhang, Xinrui and
Yang, Zhengliang and
Richardson, Kyle and
Lan, Zhenzhong",
booktitle = "Proceedings of the 28th International Conference on Computational Linguistics",
month = dec,
year = "2020",
address = "Barcelona, Spain (Online)",
publisher = "International Committee on Computational Linguistics",
url = "https://aclanthology.org/2020.coling-main.419",
doi = "10.18653/v1/2020.coling-main.419",
pages = "4762--4772",
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@JetRunner](https://github.com/JetRunner) for adding this dataset. |
open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-256-6_9b | ---
pretty_name: Evaluation run of h2oai/h2ogpt-oig-oasst1-256-6_9b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-oig-oasst1-256-6_9b](https://huggingface.co/h2oai/h2ogpt-oig-oasst1-256-6_9b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-256-6_9b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T02:25:51.324956](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-256-6_9b/blob/main/results_2023-09-23T02-25-51.324956.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0006291946308724832,\n\
\ \"em_stderr\": 0.0002568002749723939,\n \"f1\": 0.04599517617449677,\n\
\ \"f1_stderr\": 0.0011593544147047532,\n \"acc\": 0.3248508682225,\n\
\ \"acc_stderr\": 0.008493981824488952\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0006291946308724832,\n \"em_stderr\": 0.0002568002749723939,\n\
\ \"f1\": 0.04599517617449677,\n \"f1_stderr\": 0.0011593544147047532\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \
\ \"acc_stderr\": 0.0034478192723890037\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6337805840568271,\n \"acc_stderr\": 0.013540144376588901\n\
\ }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-oig-oasst1-256-6_9b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T02_25_51.324956
path:
- '**/details_harness|drop|3_2023-09-23T02-25-51.324956.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T02-25-51.324956.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T02_25_51.324956
path:
- '**/details_harness|gsm8k|5_2023-09-23T02-25-51.324956.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T02-25-51.324956.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:44:24.016368.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:44:24.016368.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:44:24.016368.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T02_25_51.324956
path:
- '**/details_harness|winogrande|5_2023-09-23T02-25-51.324956.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T02-25-51.324956.parquet'
- config_name: results
data_files:
- split: 2023_07_19T17_44_24.016368
path:
- results_2023-07-19T17:44:24.016368.parquet
- split: 2023_09_23T02_25_51.324956
path:
- results_2023-09-23T02-25-51.324956.parquet
- split: latest
path:
- results_2023-09-23T02-25-51.324956.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-oig-oasst1-256-6_9b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-oig-oasst1-256-6_9b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-oig-oasst1-256-6_9b](https://huggingface.co/h2oai/h2ogpt-oig-oasst1-256-6_9b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-256-6_9b",
"harness_winogrande_5",
        split="latest")
```
## Latest results
These are the [latest results from run 2023-09-23T02:25:51.324956](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-256-6_9b/blob/main/results_2023-09-23T02-25-51.324956.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0006291946308724832,
"em_stderr": 0.0002568002749723939,
"f1": 0.04599517617449677,
"f1_stderr": 0.0011593544147047532,
"acc": 0.3248508682225,
"acc_stderr": 0.008493981824488952
},
"harness|drop|3": {
"em": 0.0006291946308724832,
"em_stderr": 0.0002568002749723939,
"f1": 0.04599517617449677,
"f1_stderr": 0.0011593544147047532
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.0034478192723890037
},
"harness|winogrande|5": {
"acc": 0.6337805840568271,
"acc_stderr": 0.013540144376588901
}
}
```
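Since each `results_*.json` file in the repository is plain JSON, the metrics above can be parsed with the standard library alone. A minimal sketch, using a trimmed copy of the snippet (only the keys shown above, not the full file):

```python
import json

# Trimmed copy of the aggregated-results JSON shown above; the full file
# is results_2023-09-23T02-25-51.324956.json in this repository.
raw = """
{
  "all": {"em": 0.0006291946308724832, "f1": 0.04599517617449677, "acc": 0.3248508682225},
  "harness|winogrande|5": {"acc": 0.6337805840568271}
}
"""
results = json.loads(raw)
# Per-task metrics are keyed as "harness|<task>|<num_fewshot>".
winogrande_acc = results["harness|winogrande|5"]["acc"]
print(f"winogrande 5-shot accuracy: {winogrande_acc:.1%}")  # prints: winogrande 5-shot accuracy: 63.4%
```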
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/watatsuki_no_toyohime_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of watatsuki_no_toyohime/綿月豊姫 (Touhou)
This is the dataset of watatsuki_no_toyohime/綿月豊姫 (Touhou), containing 265 images and their tags.
The core tags of this character are `blonde_hair, long_hair, hat, yellow_eyes, ribbon, bow, hat_ribbon, breasts, white_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 265 | 199.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/watatsuki_no_toyohime_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 265 | 146.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/watatsuki_no_toyohime_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 478 | 260.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/watatsuki_no_toyohime_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 265 | 187.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/watatsuki_no_toyohime_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 478 | 323.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/watatsuki_no_toyohime_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/watatsuki_no_toyohime_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
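For the IMG+TXT packages listed above, each image ships alongside a same-named `.txt` file of comma-separated tags. A minimal stdlib sketch that tallies tag frequencies over such a directory (the exact layout inside the archives is an assumption here, not guaranteed):

```python
import os
from collections import Counter

def tag_frequencies(dataset_dir):
    # Each IMG+TXT package pairs an image with a .txt file holding
    # comma-separated tags; count how often each tag appears overall.
    counts = Counter()
    for name in os.listdir(dataset_dir):
        if not name.endswith(".txt"):
            continue
        with open(os.path.join(dataset_dir, name), encoding="utf-8") as f:
            tags = [t.strip() for t in f.read().split(",") if t.strip()]
        counts.update(tags)
    return counts
```

For example, `tag_frequencies("dataset_dir").most_common(10)` would list the ten most frequent tags after extracting one of the IMG+TXT archives.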
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, bangs, belt, long_sleeves, looking_at_viewer, purple_dress, simple_background, solo, white_background, white_shirt, buttons, collared_shirt, hat_bow, single_strap, large_breasts, long_dress, one-hour_drawing_challenge, blush, brown_eyes, full_body, hair_between_eyes, open_mouth, pinafore_dress, purple_bow, smile, wavy_hair |
| 1 | 17 |  |  |  |  |  | 1girl, bangs, smile, solo, white_shirt, hat_bow, long_sleeves, purple_dress, holding_fan, looking_at_viewer, collared_shirt, folding_fan, belt, blush, purple_bow, closed_mouth, buttons, simple_background, blue_dress, brown_eyes, cowboy_shot, single_strap, purple_ribbon, hair_between_eyes, pinafore_dress, standing, upper_body |
| 2 | 5 |  |  |  |  |  | 1girl, belt, looking_at_viewer, shirt, smile, solo, blush, hat_bow, juliet_sleeves, purple_dress, simple_background, sitting, very_long_hair |
| 3 | 25 |  |  |  |  |  | 1girl, solo, belt, dress, smile, folding_fan |
| 4 | 6 |  |  |  |  |  | 1girl, peach, solo, belt, open_mouth, smile, dress |
| 5 | 5 |  |  |  |  |  | 2girls, peach, smile, blush, dress, belt, open_mouth, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bangs | belt | long_sleeves | looking_at_viewer | purple_dress | simple_background | solo | white_background | white_shirt | buttons | collared_shirt | hat_bow | single_strap | large_breasts | long_dress | one-hour_drawing_challenge | blush | brown_eyes | full_body | hair_between_eyes | open_mouth | pinafore_dress | purple_bow | smile | wavy_hair | holding_fan | folding_fan | closed_mouth | blue_dress | cowboy_shot | purple_ribbon | standing | upper_body | shirt | juliet_sleeves | sitting | very_long_hair | dress | peach | 2girls |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:---------------|:--------------------|:---------------|:--------------------|:-------|:-------------------|:--------------|:----------|:-----------------|:----------|:---------------|:----------------|:-------------|:-----------------------------|:--------|:-------------|:------------|:--------------------|:-------------|:-----------------|:-------------|:--------|:------------|:--------------|:--------------|:---------------|:-------------|:--------------|:----------------|:-----------|:-------------|:--------|:-----------------|:----------|:-----------------|:--------|:--------|:---------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 1 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | X | X | X | | | | X | X | | X | | X | X | X | | X | X | X | X | X | X | X | X | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | | X | X | X | X | | | | | X | | | | | X | | | | | | | X | | | | | | | | | | X | X | X | X | | | |
| 3 | 25 |  |  |  |  |  | X | | X | | | | | X | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | X | | |
| 4 | 6 |  |  |  |  |  | X | | X | | | | | X | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | X | X | |
| 5 | 5 |  |  |  |  |  | | | X | | | | | | | | | | | | | | | X | | | | X | | | X | | | | | | | | | | | | X | | X | X | X |
|
Apinapi/LucasD | ---
license: openrail
---
|
chrisaydat/applestockpricehistory | ---
license: openrail
---
The AAPL Stock dataset includes daily historical stock market data for Apple Inc. from the New York Stock Exchange (NYSE), from 12th December 1980 to 1st April 2023. The dataset includes seven columns: Date, Open, High, Low, Close, Adjusted Close, and Volume.
The "Date" column contains the date of each trading day, while the "Open," "High," "Low," and "Close" columns represent the stock prices of Apple Inc. at the opening, highest, lowest, and closing points of each trading day, respectively.
The "Adjusted Close" column is the closing price adjusted to reflect any corporate actions, such as stock splits, dividends, or mergers and acquisitions. Finally, the "Volume" column indicates the number of shares of Apple Inc. that were traded on each trading day.
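Because the file is a flat table with these columns, it can be read with the standard `csv` module alone. A minimal sketch computing daily returns from the Adjusted Close column (the two rows below are made up for illustration; they are not real AAPL prices):

```python
import csv
import io

# Hypothetical two-row excerpt in the dataset's column layout.
sample = io.StringIO(
    "Date,Open,High,Low,Close,Adjusted Close,Volume\n"
    "2023-03-30,100.0,102.0,99.0,101.0,101.0,50000000\n"
    "2023-03-31,101.0,103.0,100.5,102.5,102.5,48000000\n"
)
rows = list(csv.DictReader(sample))
closes = [float(r["Adjusted Close"]) for r in rows]
# Simple daily return: (today - yesterday) / yesterday.
daily_returns = [(b - a) / a for a, b in zip(closes, closes[1:])]
print(round(daily_returns[0], 6))  # (102.5 - 101.0) / 101.0 -> prints 0.014851
```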
This dataset can be used to analyze the historical performance of Apple Inc. stock, to create predictive models, or to perform technical analysis to identify patterns or trends in the data. It is suitable for use in financial analysis, machine learning, and other data-driven applications. |
copenlu/fever_gold_evidence | ---
annotations_creators:
- machine-generated
- expert-generated
language_creators:
- machine-generated
- crowdsourced
language:
- en
license:
- cc-by-sa-3.0
- gpl-3.0
multilinguality:
- monolingual
paperswithcode_id: fever
pretty_name: ''
size_categories:
- 100K<n<1M
source_datasets:
- extended|fever
task_categories:
- text-classification
task_ids:
- fact-checking
---
# Dataset Card for fever_gold_evidence
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://github.com/copenlu/fever-adversarial-attacks
- **Repository:** https://github.com/copenlu/fever-adversarial-attacks
- **Paper:** https://aclanthology.org/2020.emnlp-main.256/
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
Dataset for training classification-only fact checking with claims from the FEVER dataset.
This dataset is used in the paper "Generating Label Cohesive and Well-Formed Adversarial Claims", EMNLP 2020
The evidence is the gold evidence from the FEVER dataset for *REFUTE* and *SUPPORT* claims.
For *NEI* claims, we extract evidence sentences with the system in "Christopher Malon. 2018. Team Papelo: Transformer Networks at FEVER. In Proceedings of the First Workshop on Fact Extraction and VERification (FEVER), pages 109–113."
More details can be found in https://github.com/copenlu/fever-adversarial-attacks
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
[Needs More Information]
## Dataset Structure
### Data Instances
[Needs More Information]
### Data Fields
[Needs More Information]
### Data Splits
[Needs More Information]
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
```
@inproceedings{atanasova-etal-2020-generating,
title = "Generating Label Cohesive and Well-Formed Adversarial Claims",
author = "Atanasova, Pepa and
Wright, Dustin and
Augenstein, Isabelle",
booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.emnlp-main.256",
doi = "10.18653/v1/2020.emnlp-main.256",
pages = "3168--3177",
abstract = "Adversarial attacks reveal important vulnerabilities and flaws of trained models. One potent type of attack are universal adversarial triggers, which are individual n-grams that, when appended to instances of a class under attack, can trick a model into predicting a target class. However, for inference tasks such as fact checking, these triggers often inadvertently invert the meaning of instances they are inserted in. In addition, such attacks produce semantically nonsensical inputs, as they simply concatenate triggers to existing samples. Here, we investigate how to generate adversarial attacks against fact checking systems that preserve the ground truth meaning and are semantically valid. We extend the HotFlip attack algorithm used for universal trigger generation by jointly minimizing the target class loss of a fact checking model and the entailment class loss of an auxiliary natural language inference model. We then train a conditional language model to generate semantically valid statements, which include the found universal triggers. We find that the generated attacks maintain the directionality and semantic validity of the claim better than previous work.",
}
``` |
voidful/hint-lm-data | ---
dataset_info:
features:
- name: question
dtype: string
- name: options
sequence: string
- name: answer
dtype: string
- name: hint_chatgpt
dtype: string
splits:
- name: hotpotqa_train
num_bytes: 520829
num_examples: 5481
- name: hotpotqa_validation
num_bytes: 82639
num_examples: 458
- name: openbookqa_test
num_bytes: 121454
num_examples: 500
- name: openbookqa_train
num_bytes: 830308
num_examples: 4957
- name: openbookqa_validation
num_bytes: 91011
num_examples: 500
- name: strategyqa_full
num_bytes: 255888
num_examples: 2290
- name: strategyqa_test
num_bytes: 88443
num_examples: 500
- name: strategyqa_train
num_bytes: 167445
num_examples: 1790
- name: truthfulqa_full
num_bytes: 351912
num_examples: 817
- name: truthfulqa_test
num_bytes: 228633
num_examples: 500
- name: truthfulqa_train
num_bytes: 123279
num_examples: 317
download_size: 1612358
dataset_size: 2861841
configs:
- config_name: default
data_files:
- split: hotpotqa_train
path: data/hotpotqa_train-*
- split: hotpotqa_validation
path: data/hotpotqa_validation-*
- split: openbookqa_test
path: data/openbookqa_test-*
- split: openbookqa_train
path: data/openbookqa_train-*
- split: openbookqa_validation
path: data/openbookqa_validation-*
- split: strategyqa_full
path: data/strategyqa_full-*
- split: strategyqa_test
path: data/strategyqa_test-*
- split: strategyqa_train
path: data/strategyqa_train-*
- split: truthfulqa_full
path: data/truthfulqa_full-*
- split: truthfulqa_test
path: data/truthfulqa_test-*
- split: truthfulqa_train
path: data/truthfulqa_train-*
---
# Dataset Card for "hint-lm-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
achang/stocks_one_nvda_v3_weekly | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2471045
num_examples: 1539
download_size: 148768
dataset_size: 2471045
---
# Dataset Card for "stocks_one_nvda_v3_weekly"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xezpeleta/oasst1_eu | ---
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: float64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: value
sequence: float64
splits:
- name: train
num_bytes: 82758279
num_examples: 81954
- name: validation
num_bytes: 3694724
num_examples: 3629
download_size: 28865242
dataset_size: 86453003
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
autoevaluate/autoeval-staging-eval-project-6489fc46-7764981 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: binary_classification
model: winegarj/distilbert-base-uncased-finetuned-sst2
metrics: []
dataset_name: glue
dataset_config: sst2
dataset_split: validation
col_mapping:
text: sentence
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Text Classification
* Model: winegarj/distilbert-base-uncased-finetuned-sst2
* Dataset: glue
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
Yuhthe/vietnews | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: guid
dtype: int64
- name: title
dtype: string
- name: abstract
dtype: string
- name: article
dtype: string
splits:
- name: train
num_bytes: 325418455
num_examples: 99134
- name: validation
num_bytes: 73397317
num_examples: 22184
- name: test
num_bytes: 74536959
num_examples: 22498
download_size: 241345943
dataset_size: 473352731
task_categories:
- summarization
language:
- vi
---
# Dataset Card for "vietnews"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HK83/Anime | ---
license: afl-3.0
---
|
samolego/OASST1-slovene | ---
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: float64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: value
sequence: float64
splits:
- name: validation
num_bytes: 3265400
num_examples: 3245
- name: train
num_bytes: 80318558
num_examples: 80865
download_size: 27643869
dataset_size: 83583958
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
---
|
ibragim-bad/bigbench-superglue-tsi | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task
dtype: string
splits:
- name: bigbench
num_bytes: 2889038
num_examples: 5670
- name: superglue
num_bytes: 1046139
num_examples: 2966
- name: tsi
num_bytes: 1981886
num_examples: 5000
download_size: 2835363
dataset_size: 5917063
---
# Dataset Card for "bigbench-superglue-tsi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ksmp/Faq-dataset | ---
license: mit
---
|
allenai/s2-naip | ---
license: apache-2.0
---
NAIP-S2 is a super-resolution dataset for remote sensing consisting of paired NAIP and Sentinel-2 images in the continental US.
Data is divided into tiles.
Each tile spans 512x512 pixels at 1.25 m/pixel in a UTM projection.
For each tile, the following data is available:
- NAIP: an image from 2019-2021 at 1.25 m/pixel (512x512).
- Sentinel-2: between 16 and 32 images captured within a few months of the NAIP image at 10 m/pixel (64x64).
- Sentinel-1: 4 images captured in the same year as the NAIP image at 10 m/pixel (64x64).
- Landsat: 4 images captured in the same year as the NAIP image at 10 m/pixel (64x64).
- OpenStreetMap: a GeoJSON containing buildings, roads, and 30 other categories. It uses pixel coordinates relative to the 512x512 NAIP image.
- WorldCover: the 2021 land cover image at 10 m/pixel (64x64).
Structure
---------
Once extracted, the dataset contains the different data types in different folders.
Each folder contains files named by a tile ID, which consists of the UTM projection, column, and row.
The column and row are based on tiles that are 512x512 pixels with pixel coordinates at 1.25 m/pixel, e.g. `32612_960_-6049.png` spans (614400, -3871360) to (615040, -3870720) in projection units.
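As a sketch of how a tile ID maps to projection coordinates (the helper name is ours; only the 512-pixel tile size and 1.25 m/pixel scale come from the description above):

```python
# Compute the projection-unit bounding box of a tile from its ID
# (the ID is the filename without extension, e.g. '32612_960_-6049').
# A tile is 512x512 pixels at 1.25 m/pixel, i.e. 640 projection units on a side.
TILE_SIZE = 512 * 1.25  # 640.0

def tile_bounds(tile_id: str) -> tuple[float, float, float, float]:
    """Parse an ID like '32612_960_-6049' (UTM zone EPSG code, column, row)
    and return (xmin, ymin, xmax, ymax) in that UTM projection."""
    _epsg, col, row = tile_id.split("_")
    xmin = int(col) * TILE_SIZE
    ymin = int(row) * TILE_SIZE
    return (xmin, ymin, xmin + TILE_SIZE, ymin + TILE_SIZE)

print(tile_bounds("32612_960_-6049"))
# (614400.0, -3871360.0, 615040.0, -3870720.0)
```

This reproduces the example span above: column 960 gives 960 × 640 = 614400 and row -6049 gives -6049 × 640 = -3871360.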
Here is an example of NAIP data:
```
naip/
32612_960_-6049.png
32612_960_-6050.png
32612_960_-6051.png
...
```
And an example of Sentinel-2 data:
```
sentinel2/
32612_960_-6049_16.tif
32612_960_-6049_32.tif
32612_960_-6049_8.tif
32612_960_-6050_16.tif
...
```
Note that the Sentinel-2 images are GeoTIFFs so they contain georeference metadata.
Furthermore, the 10 m/pixel (`_8.tif`), 20 m/pixel (`_16.tif`), and 60 m/pixel (`_32.tif`) bands are stored separately. |
result-kand2-sdxl-wuerst-karlo/7f9071c2 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 171
num_examples: 10
download_size: 1324
dataset_size: 171
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "7f9071c2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AmjedBel/fill1000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 9029264.62
num_examples: 1000
download_size: 6258237
dataset_size: 9029264.62
---
# Dataset Card for "fill1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bdsaglam/musique-jerx-sft-st-ss-openai | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 109114
num_examples: 110
download_size: 32003
dataset_size: 109114
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-cn-llm-leaderboard/dynamic_model_information | ---
license: apache-2.0
---
|
narrativeqa_manual | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text2text-generation
task_ids:
- abstractive-qa
paperswithcode_id: narrativeqa
pretty_name: NarrativeQA
dataset_info:
features:
- name: document
struct:
- name: id
dtype: string
- name: kind
dtype: string
- name: url
dtype: string
- name: file_size
dtype: int32
- name: word_count
dtype: int32
- name: start
dtype: string
- name: end
dtype: string
- name: summary
struct:
- name: text
dtype: string
- name: tokens
sequence: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: question
struct:
- name: text
dtype: string
- name: tokens
sequence: string
- name: answers
list:
- name: text
dtype: string
- name: tokens
sequence: string
splits:
- name: train
num_bytes: 9115940054
num_examples: 32747
- name: test
num_bytes: 2911702563
num_examples: 10557
- name: validation
num_bytes: 968994186
num_examples: 3461
download_size: 22638273
dataset_size: 12996636803
---
# Dataset Card for Narrative QA Manual
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [NarrativeQA Homepage](https://deepmind.com/research/open-source/narrativeqa)
- **Repository:** [NarrativeQA Repo](https://github.com/deepmind/narrativeqa)
- **Paper:** [The NarrativeQA Reading Comprehension Challenge](https://arxiv.org/pdf/1712.07040.pdf)
- **Leaderboard:**
- **Point of Contact:** [Tomáš Kočiský](mailto:tkocisky@google.com) [Jonathan Schwarz](mailto:schwarzjn@google.com) [Phil Blunsom](mailto:pblunsom@google.com) [Chris Dyer](mailto:cdyer@google.com) [Karl Moritz Hermann](mailto:kmh@google.com) [Gábor Melis](mailto:melisgl@google.com) [Edward Grefenstette](mailto:etg@google.com)
### Dataset Summary
NarrativeQA Manual is an English-language dataset of stories and corresponding questions designed to test reading comprehension, especially on long documents. THIS DATASET REQUIRES A MANUALLY DOWNLOADED FILE! Because the script in the original repository downloads the stories from their original URLs every time, some links are broken or invalid. Therefore, you need to manually download the stories for this dataset using the script provided by the authors (https://github.com/deepmind/narrativeqa/blob/master/download_stories.sh). Running the shell script creates a folder named "tmp" in the root directory and downloads the stories there. This folder containing the stories can then be used to load the dataset via `datasets.load_dataset("narrativeqa_manual", data_dir="<path/to/folder>")`.
### Supported Tasks and Leaderboards
The dataset is used to test reading comprehension. There are 2 tasks proposed in the paper: "summaries only" and "stories only", depending on whether the human-generated summary or the full story text is used to answer the question.
### Languages
English
## Dataset Structure
### Data Instances
A typical data point consists of a question and answer pair along with a summary/story which can be used to answer the question. Additional information such as the URL, word count, and Wikipedia page is also provided.
A typical example looks like this:
```
{
"document": {
"id": "23jncj2n3534563110",
"kind": "movie",
"url": "https://www.imsdb.com/Movie%20Scripts/Name%20of%20Movie.html",
"file_size": 80473,
"word_count": 41000,
"start": "MOVIE screenplay by",
"end": ". THE END",
"summary": {
"text": "Joe Bloggs begins his journey exploring...",
"tokens": ["Joe", "Bloggs", "begins", "his", "journey", "exploring",...],
"url": "http://en.wikipedia.org/wiki/Name_of_Movie",
"title": "Name of Movie (film)"
},
"text": "MOVIE screenplay by John Doe\nSCENE 1..."
},
"question": {
"text": "Where does Joe Bloggs live?",
"tokens": ["Where", "does", "Joe", "Bloggs", "live", "?"],
},
"answers": [
{"text": "At home", "tokens": ["At", "home"]},
{"text": "His house", "tokens": ["His", "house"]}
]
}
```
### Data Fields
- `document.id` - Unique ID for the story.
- `document.kind` - "movie" or "gutenberg" depending on the source of the story.
- `document.url` - The URL where the story was downloaded from.
- `document.file_size` - File size (in bytes) of the story.
- `document.word_count` - Number of tokens in the story.
- `document.start` - First 3 tokens of the story. Used for verifying the story hasn't been modified.
- `document.end` - Last 3 tokens of the story. Used for verifying the story hasn't been modified.
- `document.summary.text` - Text of the wikipedia summary of the story.
- `document.summary.tokens` - Tokenized version of `document.summary.text`.
- `document.summary.url` - Wikipedia URL of the summary.
- `document.summary.title` - Wikipedia Title of the summary.
- `question` - `{"text":"...", "tokens":[...]}` for the question about the story.
- `answers` - List of `{"text":"...", "tokens":[...]}` for valid answers for the question.
### Data Splits
The data is split into training, validation, and test sets by story (i.e. the same story cannot appear in more than one split):
| Train | Valid | Test |
| ------ | ----- | ----- |
| 32747 | 3461 | 10557 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
Stories and movie scripts were downloaded from [Project Gutenberg](https://www.gutenberg.org) and a range of movie script repositories (mainly [imsdb](http://www.imsdb.com)).
#### Who are the source language producers?
The language producers are the authors of the stories and scripts, as well as Amazon Mechanical Turk workers for the questions.
### Annotations
#### Annotation process
Amazon Mechanical Turk workers were provided with human-written summaries of the stories (to make the annotation tractable and to steer annotators towards asking non-localized questions). Stories were matched with plot summaries from Wikipedia using titles, and the matching was verified with help from human annotators. The annotators were asked to determine whether both the story and the summary refer to a movie or to a book (as some books are made into movies), and whether they are the same part in a series produced in the same year. Annotators were then instructed to write 10 question-answer pairs each, based solely on a given summary, imagining that they were writing questions to test students who had read the full stories but not the summaries. The authors required questions that are specific enough, given the length and complexity of the narratives, and asked for a diverse set of questions about characters, events, why something happened, and so on. Annotators were encouraged to use their own words and were prevented from copying. Answers were required to be grammatical, complete sentences, though short answers (one word, a few-word phrase, or a short sentence) were explicitly allowed, as answering with a full sentence is frequently perceived as artificial when asking about factual information. Annotators were asked to avoid extra, unnecessary information in the question or the answer, and to avoid yes/no questions or questions about the author or the actors.
#### Who are the annotators?
Amazon Mechanical Turk workers.
### Personal and Sensitive Information
None
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The dataset is released under an [Apache-2.0 License](https://github.com/deepmind/narrativeqa/blob/master/LICENSE).
### Citation Information
```
@article{narrativeqa,
author = {Tom\'a\v s Ko\v cisk\'y and Jonathan Schwarz and Phil Blunsom and
Chris Dyer and Karl Moritz Hermann and G\'abor Melis and
Edward Grefenstette},
title = {The {NarrativeQA} Reading Comprehension Challenge},
journal = {Transactions of the Association for Computational Linguistics},
url = {https://TBD},
volume = {TBD},
year = {2018},
pages = {TBD},
}
```
### Contributions
Thanks to [@rsanjaykamath](https://github.com/rsanjaykamath) for adding this dataset. |
CyberHarem/tsukumo_benben_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tsukumo_benben/九十九弁々 (Touhou)
This is the dataset of tsukumo_benben/九十九弁々 (Touhou), containing 84 images and their tags.
The core tags of this character are `long_hair, purple_hair, hair_flower, hair_ornament, twintails, purple_eyes, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 84 | 114.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsukumo_benben_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 84 | 74.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsukumo_benben_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 174 | 125.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsukumo_benben_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 84 | 104.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsukumo_benben_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 174 | 162.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsukumo_benben_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tsukumo_benben_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, biwa_lute, dress, flower, solo, chain, long_sleeves, musical_note, smile, looking_at_viewer, barefoot, open_mouth |
| 1 | 8 |  |  |  |  |  | 1girl, biwa_lute, dress, flower, smile, solo, long_sleeves, playing_instrument, closed_eyes, chain, quarter_note, staff_(music) |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | biwa_lute | dress | flower | solo | chain | long_sleeves | musical_note | smile | looking_at_viewer | barefoot | open_mouth | playing_instrument | closed_eyes | quarter_note | staff_(music) |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:--------|:---------|:-------|:--------|:---------------|:---------------|:--------|:--------------------|:-----------|:-------------|:---------------------|:--------------|:---------------|:----------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | | | X | X | X | X |
|
open-llm-leaderboard/details_Gille__StrangeMerges_40-7B-dare_ties | ---
pretty_name: Evaluation run of Gille/StrangeMerges_40-7B-dare_ties
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gille/StrangeMerges_40-7B-dare_ties](https://huggingface.co/Gille/StrangeMerges_40-7B-dare_ties)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_40-7B-dare_ties\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T11:53:41.512785](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_40-7B-dare_ties/blob/main/results_2024-03-21T11-53-41.512785.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6519671462530084,\n\
\ \"acc_stderr\": 0.03215820528540388,\n \"acc_norm\": 0.6514117640957972,\n\
\ \"acc_norm_stderr\": 0.03282902861786898,\n \"mc1\": 0.616891064871481,\n\
\ \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.7720923476730807,\n\
\ \"mc2_stderr\": 0.013887458971622677\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7107508532423208,\n \"acc_stderr\": 0.013250012579393441,\n\
\ \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869148\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7226648078072098,\n\
\ \"acc_stderr\": 0.004467684132772412,\n \"acc_norm\": 0.8861780521808404,\n\
\ \"acc_norm_stderr\": 0.0031694581233577238\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\"\
: 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n\
\ \"acc_stderr\": 0.015919557829976037,\n \"acc_norm\": 0.8348623853211009,\n\
\ \"acc_norm_stderr\": 0.015919557829976037\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n\
\ \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4592178770949721,\n\
\ \"acc_stderr\": 0.016666783616525776,\n \"acc_norm\": 0.4592178770949721,\n\
\ \"acc_norm_stderr\": 0.016666783616525776\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137894,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137894\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4758800521512386,\n\
\ \"acc_stderr\": 0.012755368722863935,\n \"acc_norm\": 0.4758800521512386,\n\
\ \"acc_norm_stderr\": 0.012755368722863935\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.616891064871481,\n\
\ \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.7720923476730807,\n\
\ \"mc2_stderr\": 0.013887458971622677\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8397790055248618,\n \"acc_stderr\": 0.010309209498187479\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6800606520090978,\n \
\ \"acc_stderr\": 0.012848426555240763\n }\n}\n```"
repo_url: https://huggingface.co/Gille/StrangeMerges_40-7B-dare_ties
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|arc:challenge|25_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|gsm8k|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hellaswag|10_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T11-53-41.512785.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T11-53-41.512785.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- '**/details_harness|winogrande|5_2024-03-21T11-53-41.512785.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T11-53-41.512785.parquet'
- config_name: results
data_files:
- split: 2024_03_21T11_53_41.512785
path:
- results_2024-03-21T11-53-41.512785.parquet
- split: latest
path:
- results_2024-03-21T11-53-41.512785.parquet
---
# Dataset Card for Evaluation run of Gille/StrangeMerges_40-7B-dare_ties
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_40-7B-dare_ties](https://huggingface.co/Gille/StrangeMerges_40-7B-dare_ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_40-7B-dare_ties",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-21T11:53:41.512785](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_40-7B-dare_ties/blob/main/results_2024-03-21T11-53-41.512785.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6519671462530084,
"acc_stderr": 0.03215820528540388,
"acc_norm": 0.6514117640957972,
"acc_norm_stderr": 0.03282902861786898,
"mc1": 0.616891064871481,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.7720923476730807,
"mc2_stderr": 0.013887458971622677
},
"harness|arc:challenge|25": {
"acc": 0.7107508532423208,
"acc_stderr": 0.013250012579393441,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869148
},
"harness|hellaswag|10": {
"acc": 0.7226648078072098,
"acc_stderr": 0.004467684132772412,
"acc_norm": 0.8861780521808404,
"acc_norm_stderr": 0.0031694581233577238
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.0399552400768168,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.0399552400768168
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976037,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290902,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290902
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993457,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993457
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4592178770949721,
"acc_stderr": 0.016666783616525776,
"acc_norm": 0.4592178770949721,
"acc_norm_stderr": 0.016666783616525776
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137894,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179615,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179615
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4758800521512386,
"acc_stderr": 0.012755368722863935,
"acc_norm": 0.4758800521512386,
"acc_norm_stderr": 0.012755368722863935
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.616891064871481,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.7720923476730807,
"mc2_stderr": 0.013887458971622677
},
"harness|winogrande|5": {
"acc": 0.8397790055248618,
"acc_stderr": 0.010309209498187479
},
"harness|gsm8k|5": {
"acc": 0.6800606520090978,
"acc_stderr": 0.012848426555240763
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tyzhu/eval_tag_squad_v9 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 13273785
num_examples: 10570
- name: validation
num_bytes: 13273785
num_examples: 10570
download_size: 5722530
dataset_size: 26547570
---
# Dataset Card for "eval_tag_squad_v9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
allandclive/Ateso_news_articles | ---
task_categories:
- text-generation
- text2text-generation
size_categories:
- n<1K
tags:
- teo
---
# Ateso News Articles
Ateso (teo) is one of the most widely spoken languages in Uganda.
## Dataset Details
Articles were scraped from https://www.aicerit.co.ug |
Dovakauhm/tue | ---
license: apache-2.0
---
|
polinaeterna/push_to_hub_singe_nondefault_config | ---
dataset_info:
config_name: custom
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 1600
num_examples: 100
- name: random
num_bytes: 160
num_examples: 10
download_size: 3650
dataset_size: 1760
builder_config:
config_name: custom
data_files:
- split: train
pattern: custom/train-*
- split: random
pattern: custom/random-*
---
# Dataset Card for "push_to_hub_singe_nondefault_config"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LahiruLowe/t0_explanation_targets_vilsonrodrigues_falcon7b_instructsharded | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
- name: explained_targets
dtype: string
splits:
- name: train
num_bytes: 564163
num_examples: 386
download_size: 201424
dataset_size: 564163
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "t0_explanation_targets"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RyokoExtra/LFANIME | ---
license: cc
tags:
- art
- anime
pretty_name: LFAnime
task_categories:
- image-classification
- text-to-image
---
# Dataset Card for LFANIME
A dataset of anime frames collected by KaraKaraWitch.
## Dataset Details
### Dataset Description
LFANIME, or Low-Framerate Anime, comprises frames from Japanese animation. The dataset serves dual purposes—facilitating fine-tuning of image diffusion models and functioning as a pre-training resource. Moreover, we anticipate its utilization in image classification.
Important Note: LFAnime is not intended for watching anime. To discourage this application, we have intentionally lowered the frame rate and excluded audio from the dataset.
- **Curated by:** KaraKaraWitch
- **Funded by [optional]:** N/A
- **Shared by [optional]:** N/A
- **Language(s) (NLP):** Nil. Primarily Japanese, but no audio is included.
- **License:** CC
## Uses
A tar file compresses each "Episode," encompassing sequential anime frames. The dataset also incorporates chapters for episodes that have them. It's important to note that certain frame numbers may be absent intentionally.
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
We release this dataset for free in the hopes that it could be used for text to image generation and/or image classification.
### Out-of-Scope Use
Technically speaking, this dataset could be used to watch anime. However, we do not recommend doing so.
Additionally, there could be unforeseen uses that the author does not intend.
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
Each tar file should generally follow this format `LFAnime-[T(Test),A(Alpha),B(Beta),R(Release)]-[Sequential Index]-[AnilistID]-[Episode]`
Each tar file should contain:
```
frame_[XXXX]_[detection_type]_[seconds (float)].jpg
kframes.log (scxvid keyframe log)
metadata.json (Selected frames + Detection metrics + Mode)
```
`detection_type` can be one of the following:
```
- key (KeyFrame)
- p_key (Previous Frame from Key Frame)
- inter (Inter frame)
```
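As an illustration (not part of the dataset tooling), the frame filename convention above can be parsed with a small regular expression; `parse_frame_name` is a hypothetical helper written against the `frame_[XXXX]_[detection_type]_[seconds (float)].jpg` pattern:

```python
import re

# Matches frame filenames of the form:
#   frame_[XXXX]_[detection_type]_[seconds (float)].jpg
# "p_key" is listed before "key" so the alternation matches it in full.
FRAME_RE = re.compile(
    r"frame_(?P<index>\d+)_(?P<detection>p_key|key|inter)_(?P<seconds>\d+(?:\.\d+)?)\.jpg$"
)

def parse_frame_name(name: str):
    """Return (frame index, detection type, timestamp in seconds), or None."""
    m = FRAME_RE.match(name)
    if m is None:
        return None  # e.g. kframes.log or metadata.json
    return int(m.group("index")), m.group("detection"), float(m.group("seconds"))
```

For example, `parse_frame_name("frame_0042_p_key_12.5.jpg")` yields `(42, "p_key", 12.5)`, while non-frame members such as `kframes.log` yield `None`.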
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
The emphasis has been on developing models for generating images from text, particularly in the realm of creating "anime"-style visuals.
Examples of such models include Waifu Diffusion and NovelAI's SD 1.x models. Regrettably, these models tend to converge, resulting in a consistent aesthetic.
While this aesthetic may appeal to many users, it poses a challenge when attempting to diverge from or fine-tune the ingrained visual style of most SD 1.x models.
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
We've opted not to reveal the specific origins of the anime to establish a level of separation between the producers and this dataset.
Nevertheless, we can outline the processing steps as follows:
1. Extract frames from the mkv file, sampling every 10 frames per second.
2. Utilize scxvid to generate a timecode for identifying scene cuts.
3. Exclude frames that precede or follow a scene cut (considering potential inclusion of 1/2 frames at each scene cut).
4. Save the processed frames to a tar file.
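Step 3 of the pipeline above (dropping frames adjacent to a scene cut) can be sketched as follows; `filter_scene_cut_frames` is a hypothetical helper, assuming the keyframe indices have already been parsed out of the scxvid log:

```python
def filter_scene_cut_frames(frame_indices, keyframe_indices, margin=1):
    """Drop every frame within `margin` frames of a detected scene cut."""
    excluded = set()
    for k in keyframe_indices:
        # Exclude the keyframe itself plus its immediate neighbours,
        # since 1-2 frames at each cut may blend two scenes.
        excluded.update(range(k - margin, k + margin + 1))
    return [i for i in frame_indices if i not in excluded]
```

With a cut at frame 3, `filter_scene_cut_frames([0, 1, 2, 3, 4, 5], [3])` returns `[0, 1, 5]`.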
#### Who are the source data producers?
We have decided not to disclose the exact sources.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
As this dataset is a personal collection from KaraKaraWitch, it will tend not to include "Shonen" anime and will generally feature female protagonists.
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset.
## Citation [optional]
```
@misc{lfanime,
title = {LFAnime: A Low Framerate anime dataset.},
author = {KaraKaraWitch},
year = {2023},
howpublished = {\url{https://huggingface.co/datasets/RyokoExtra/LFANIME}},
}
```
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
Anime:
> Anime (Japanese: アニメ, IPA: [aꜜɲime]) is hand-drawn and computer-generated animation originating from Japan. Outside Japan and in English, anime refers specifically to animation produced in Japan.[1] However, in Japan and in Japanese, anime (a term derived from a shortening of the English word animation) describes all animated works, regardless of style or origin. Many works of animation with a similar style to Japanese animation are also produced outside Japan. Video games sometimes also feature themes and artstyles that can be considered as "anime".
> - Wikipedia
### Contributions
- [@KaraKaraWitch (Twitter)](https://twitter.com/KaraKaraWitch) for gathering this dataset.
- [ChatGPT](https://chat.openai.com) for rewording sentences in this datacard. |
duongttr/chebi-20 | ---
dataset_info:
features:
- name: ID
dtype: string
- name: DESCRIPTION
dtype: string
- name: SELFIES
dtype: string
- name: CAN_SMILES
dtype: string
- name: IMAGE
dtype: image
splits:
- name: train
num_bytes: 484891836.0
num_examples: 26408
- name: validation
num_bytes: 60854977.375
num_examples: 3301
- name: test
num_bytes: 59648443.375
num_examples: 3301
download_size: 585165876
dataset_size: 605395256.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
joey234/mmlu-high_school_european_history-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 35805
num_examples: 5
- name: test
num_bytes: 1243562
num_examples: 165
download_size: 66756
dataset_size: 1279367
---
# Dataset Card for "mmlu-high_school_european_history-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
argilla/ultrafeedback-capybara-mix-5k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: source
dtype: string
- name: conversation
list:
- name: input
dtype: string
- name: output
dtype: string
- name: original_response
dtype: string
- name: generation_prompt
sequence: string
- name: raw_generation_responses
sequence: string
- name: new_generations
sequence: string
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rating_chosen
dtype: int64
- name: rating_rejected
dtype: int64
- name: chosen_model
dtype: string
- name: rejected_model
dtype: string
- name: turns
dtype: int64
- name: chosen-rating
dtype: float64
- name: chosen-model
dtype: string
- name: rejected-rating
dtype: float64
- name: rejected-model
dtype: string
splits:
- name: train
num_bytes: 40207332.027373314
num_examples: 4500
- name: test
num_bytes: 4467481.336374813
num_examples: 500
download_size: 60430325
dataset_size: 44674813.363748126
---
# Dataset Card for "ultrafeedback-capybara-mix-5k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jlbaker361/all_4_u_gnome | ---
dataset_info:
features:
- name: images
dtype: image
splits:
- name: train
num_bytes: 27477492.0
num_examples: 57
download_size: 27463507
dataset_size: 27477492.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gguichard/myriade_noun_aligned_with_wordnet_v2 | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: wn_sens
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 98888979
num_examples: 162516
download_size: 22776318
dataset_size: 98888979
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "myriade_noun_aligned_with_wordnet_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-xsum-69daf1dd-12935739 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: sshleifer/distilbart-cnn-12-6
metrics: ['bleu']
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: sshleifer/distilbart-cnn-12-6
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@xarymast](https://huggingface.co/xarymast) for evaluating this model. |
gorabbani/llama2_7b_chat | ---
license: llama2
---
|
Muhammad2003/Big_Pretrain_11K | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 32927632
num_examples: 11707
download_size: 10674173
dataset_size: 32927632
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DeliberatorArchiver/movie_binaries_0010 | ---
viewer: false
--- |
JihyukKim/eli5-subquestion-d2-paired-sft | ---
dataset_info:
features:
- name: qid
dtype: string
- name: question
dtype: string
- name: response_j
dtype: string
- name: response_k
dtype: string
- name: gold_claims
sequence: string
- name: response_j_claims
sequence: string
- name: response_k_claims
sequence: string
splits:
- name: train
num_bytes: 19891518
num_examples: 16494
- name: test
num_bytes: 382602
num_examples: 317
download_size: 6291749
dataset_size: 20274120
---
# Dataset Card for "eli5-subquestion-d2-paired-sft"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Felladrin__TinyMistral-248M-Chat-v2 | ---
pretty_name: Evaluation run of Felladrin/TinyMistral-248M-Chat-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Felladrin/TinyMistral-248M-Chat-v2](https://huggingface.co/Felladrin/TinyMistral-248M-Chat-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Felladrin__TinyMistral-248M-Chat-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T21:32:49.022074](https://huggingface.co/datasets/open-llm-leaderboard/details_Felladrin__TinyMistral-248M-Chat-v2/blob/main/results_2024-04-02T21-32-49.022074.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23548173558413027,\n\
\ \"acc_stderr\": 0.029992044204323092,\n \"acc_norm\": 0.2358268812315828,\n\
\ \"acc_norm_stderr\": 0.030783740280310598,\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.014509045171487295,\n \"mc2\": 0.41322892036563413,\n\
\ \"mc2_stderr\": 0.015009018402062363\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19539249146757678,\n \"acc_stderr\": 0.011586907189952911,\n\
\ \"acc_norm\": 0.23293515358361774,\n \"acc_norm_stderr\": 0.01235250704261741\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2719577773351922,\n\
\ \"acc_stderr\": 0.004440588618232711,\n \"acc_norm\": 0.27394941246763593,\n\
\ \"acc_norm_stderr\": 0.004450718673552664\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827842,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827842\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.20833333333333334,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.17,\n\
\ \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \
\ \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.0309528902177499,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.0309528902177499\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364396,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364396\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.029644006577009618,\n\
\ \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.029644006577009618\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194978,\n \"\
acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194978\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790605,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.15,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n\
\ \"acc_stderr\": 0.025091892378859275,\n \"acc_norm\": 0.2645161290322581,\n\
\ \"acc_norm_stderr\": 0.025091892378859275\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03144712581678242,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03144712581678242\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2878787878787879,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.2878787878787879,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178253,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178253\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.02093244577446318,\n\
\ \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.02093244577446318\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23333333333333334,\n \"acc_stderr\": 0.02578787422095931,\n \
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.02578787422095931\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958948,\n\
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958948\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21284403669724772,\n \"acc_stderr\": 0.017549376389313694,\n \"\
acc_norm\": 0.21284403669724772,\n \"acc_norm_stderr\": 0.017549376389313694\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18981481481481483,\n \"acc_stderr\": 0.026744714834691943,\n \"\
acc_norm\": 0.18981481481481483,\n \"acc_norm_stderr\": 0.026744714834691943\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604243,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604243\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.02904133351059804,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.02904133351059804\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21524663677130046,\n\
\ \"acc_stderr\": 0.027584066602208263,\n \"acc_norm\": 0.21524663677130046,\n\
\ \"acc_norm_stderr\": 0.027584066602208263\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591206,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591206\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.1875,\n\
\ \"acc_stderr\": 0.0370468111477387,\n \"acc_norm\": 0.1875,\n \
\ \"acc_norm_stderr\": 0.0370468111477387\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24904214559386972,\n\
\ \"acc_stderr\": 0.015464676163395967,\n \"acc_norm\": 0.24904214559386972,\n\
\ \"acc_norm_stderr\": 0.015464676163395967\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.22832369942196531,\n \"acc_stderr\": 0.022598703804321628,\n\
\ \"acc_norm\": 0.22832369942196531,\n \"acc_norm_stderr\": 0.022598703804321628\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2670391061452514,\n\
\ \"acc_stderr\": 0.01479650262256255,\n \"acc_norm\": 0.2670391061452514,\n\
\ \"acc_norm_stderr\": 0.01479650262256255\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.0248480182638752,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.0248480182638752\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.022122439772480764,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.022122439772480764\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2839506172839506,\n \"acc_stderr\": 0.025089478523765134,\n\
\ \"acc_norm\": 0.2839506172839506,\n \"acc_norm_stderr\": 0.025089478523765134\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.28368794326241137,\n \"acc_stderr\": 0.026891709428343957,\n \
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.026891709428343957\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24445893089960888,\n\
\ \"acc_stderr\": 0.010976425013113899,\n \"acc_norm\": 0.24445893089960888,\n\
\ \"acc_norm_stderr\": 0.010976425013113899\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1801470588235294,\n \"acc_stderr\": 0.02334516361654486,\n\
\ \"acc_norm\": 0.1801470588235294,\n \"acc_norm_stderr\": 0.02334516361654486\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322267,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322267\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22857142857142856,\n \"acc_stderr\": 0.02688214492230774,\n\
\ \"acc_norm\": 0.22857142857142856,\n \"acc_norm_stderr\": 0.02688214492230774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n\
\ \"acc_stderr\": 0.03175554786629921,\n \"acc_norm\": 0.21084337349397592,\n\
\ \"acc_norm_stderr\": 0.03175554786629921\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.03301405946987249,\n\
\ \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.03301405946987249\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.014509045171487295,\n \"mc2\": 0.41322892036563413,\n\
\ \"mc2_stderr\": 0.015009018402062363\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.49013417521704816,\n \"acc_stderr\": 0.014049749833367596\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Felladrin/TinyMistral-248M-Chat-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|arc:challenge|25_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|gsm8k|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hellaswag|10_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T21-32-49.022074.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T21-32-49.022074.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- '**/details_harness|winogrande|5_2024-04-02T21-32-49.022074.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T21-32-49.022074.parquet'
- config_name: results
data_files:
- split: 2024_04_02T21_32_49.022074
path:
- results_2024-04-02T21-32-49.022074.parquet
- split: latest
path:
- results_2024-04-02T21-32-49.022074.parquet
---
# Dataset Card for Evaluation run of Felladrin/TinyMistral-248M-Chat-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Felladrin/TinyMistral-248M-Chat-v2](https://huggingface.co/Felladrin/TinyMistral-248M-Chat-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Felladrin__TinyMistral-248M-Chat-v2",
"harness_winogrande_5",
	split="latest")
```
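As the config listing above shows, the per-run split names are simply the ISO run timestamp with its `-` and `:` separators replaced by underscores. A minimal sketch of that mapping (the helper name is ours, not part of the `datasets` API):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map an ISO-style run timestamp to the corresponding split name."""
    # e.g. "2024-04-02T21:32:49.022074" -> "2024_04_02T21_32_49.022074"
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2024-04-02T21:32:49.022074"))
# 2024_04_02T21_32_49.022074
```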
## Latest results
These are the [latest results from run 2024-04-02T21:32:49.022074](https://huggingface.co/datasets/open-llm-leaderboard/details_Felladrin__TinyMistral-248M-Chat-v2/blob/main/results_2024-04-02T21-32-49.022074.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.23548173558413027,
"acc_stderr": 0.029992044204323092,
"acc_norm": 0.2358268812315828,
"acc_norm_stderr": 0.030783740280310598,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487295,
"mc2": 0.41322892036563413,
"mc2_stderr": 0.015009018402062363
},
"harness|arc:challenge|25": {
"acc": 0.19539249146757678,
"acc_stderr": 0.011586907189952911,
"acc_norm": 0.23293515358361774,
"acc_norm_stderr": 0.01235250704261741
},
"harness|hellaswag|10": {
"acc": 0.2719577773351922,
"acc_stderr": 0.004440588618232711,
"acc_norm": 0.27394941246763593,
"acc_norm_stderr": 0.004450718673552664
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827842,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827842
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.0309528902177499,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.0309528902177499
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364396,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364396
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28936170212765955,
"acc_stderr": 0.029644006577009618,
"acc_norm": 0.28936170212765955,
"acc_norm_stderr": 0.029644006577009618
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.023201392938194978,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.023201392938194978
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790605,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.15,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.15,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.025091892378859275,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.025091892378859275
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.03144712581678242,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.03144712581678242
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2878787878787879,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.2878787878787879,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178253,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178253
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.02093244577446318,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.02093244577446318
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.02578787422095931,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.02578787422095931
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958948,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958948
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21284403669724772,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.21284403669724772,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18981481481481483,
"acc_stderr": 0.026744714834691943,
"acc_norm": 0.18981481481481483,
"acc_norm_stderr": 0.026744714834691943
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604243,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604243
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.02904133351059804,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.02904133351059804
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21524663677130046,
"acc_stderr": 0.027584066602208263,
"acc_norm": 0.21524663677130046,
"acc_norm_stderr": 0.027584066602208263
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591206,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591206
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.1875,
"acc_stderr": 0.0370468111477387,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.0370468111477387
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24904214559386972,
"acc_stderr": 0.015464676163395967,
"acc_norm": 0.24904214559386972,
"acc_norm_stderr": 0.015464676163395967
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22832369942196531,
"acc_stderr": 0.022598703804321628,
"acc_norm": 0.22832369942196531,
"acc_norm_stderr": 0.022598703804321628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2670391061452514,
"acc_stderr": 0.01479650262256255,
"acc_norm": 0.2670391061452514,
"acc_norm_stderr": 0.01479650262256255
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.022122439772480764,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.022122439772480764
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2839506172839506,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.2839506172839506,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.026891709428343957,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.026891709428343957
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24445893089960888,
"acc_stderr": 0.010976425013113899,
"acc_norm": 0.24445893089960888,
"acc_norm_stderr": 0.010976425013113899
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1801470588235294,
"acc_stderr": 0.02334516361654486,
"acc_norm": 0.1801470588235294,
"acc_norm_stderr": 0.02334516361654486
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322267,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322267
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22857142857142856,
"acc_stderr": 0.02688214492230774,
"acc_norm": 0.22857142857142856,
"acc_norm_stderr": 0.02688214492230774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.03175554786629921,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.03175554786629921
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487295,
"mc2": 0.41322892036563413,
"mc2_stderr": 0.015009018402062363
},
"harness|winogrande|5": {
"acc": 0.49013417521704816,
"acc_stderr": 0.014049749833367596
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
misclassified/meps_speeches_with_translation.csv | ---
license: apache-2.0
---
|
Solshine/Biodiversity_In_National_Parks | ---
license: cc
---
Context
The National Park Service publishes a database of animal and plant species identified in individual national parks and verified by evidence — observations, vouchers, or reports that document the presence of a species in a park. All park species records are available to the public on the National Park Species portal; exceptions are made for sensitive, threatened, or endangered species when widespread distribution of information could pose a risk to the species in the park.
Content
National Park species lists provide information on the presence and status of species in our national parks. These species lists are works in progress and the absence of a species from a list does not necessarily mean the species is absent from a park. The time and effort spent on species inventories varies from park to park, which may result in data gaps. Species taxonomy changes over time and reflects regional variations or preferences; therefore, records may be listed under a different species name.
Each park species record includes a species ID, park name, taxonomic information, scientific name, one or more common names, record status, occurrence (verification of species presence in park), nativeness (species native or foreign to park), abundance (presence and visibility of species in park), seasonality (season and nature of presence in park), and conservation status (species classification according to US Fish & Wildlife Service). Taxonomic classes have been translated from Latin to English for species categorization; order, family, and scientific name (genus, species, subspecies) are in Latin.
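A common use of records shaped like this is filtering by park and conservation status. A minimal, illustrative sketch (the field names follow the description above; the example values and helper are invented, not taken from the actual NPSpecies export):

```python
# Illustrative only: toy records that mimic the schema described above.
records = [
    {"species_id": "ACAD-1000", "park_name": "Acadia National Park",
     "scientific_name": "Alces alces", "nativeness": "Native",
     "conservation_status": None},
    {"species_id": "ACAD-1001", "park_name": "Acadia National Park",
     "scientific_name": "Canis lupus", "nativeness": "Native",
     "conservation_status": "Endangered"},
    {"species_id": "BADL-1000", "park_name": "Badlands National Park",
     "scientific_name": "Bison bison", "nativeness": "Native",
     "conservation_status": None},
]

def listed_species(records, park):
    """Scientific names of species in `park` that carry a conservation status."""
    return [r["scientific_name"] for r in records
            if r["park_name"] == park and r["conservation_status"]]

print(listed_species(records, "Acadia National Park"))  # ['Canis lupus']
```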
Acknowledgements
The National Park Service species list database is managed and updated by staff at individual national parks and the systemwide Inventory and Monitoring department.
Source: https://irma.nps.gov/NPSpecies
Also available on Kaggle: https://www.kaggle.com/datasets/nationalparkservice/park-biodiversity
If you are interested in accessing this data via web services, please go to: http://irmaservices.nps.gov |
CyberHarem/privaty_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of privaty/プリバティ/普丽瓦蒂/프리바티 (Nikke: Goddess of Victory)
This is the dataset of privaty/プリバティ/普丽瓦蒂/프리바티 (Nikke: Goddess of Victory), containing 121 images and their tags.
The core tags of this character are `long_hair, twintails, yellow_eyes, bangs, breasts, blue_hair, hat, very_long_hair, large_breasts, white_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 121 | 196.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/privaty_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 121 | 96.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/privaty_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 277 | 202.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/privaty_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 121 | 164.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/privaty_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 277 | 312.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/privaty_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/privaty_nikke',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
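For the plain IMG+TXT packages, each image ships with a same-stem `.txt` tag file. A minimal loading sketch, assuming the archive has already been extracted (the helper name and the set of accepted extensions are assumptions, not part of the release):

```python
import os

def load_img_txt_pairs(dataset_dir):
    """Yield (image_path, tags) for each image with a same-stem .txt tag file."""
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if not os.path.exists(txt_path):
            continue  # skip images without a tag file
        with open(txt_path, encoding='utf-8') as f:
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        yield os.path.join(dataset_dir, name), tags
```

This treats the `.txt` files as comma-separated tag lists, which matches the usual IMG+TXT training convention.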
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, military_uniform, solo, looking_at_viewer, open_mouth, bodysuit, long_sleeves, military_hat, belt, blush, thighhighs |
| 1 | 13 |  |  |  |  |  | 1girl, blush, looking_at_viewer, looking_back, military_uniform, peaked_cap, solo, white_thighhighs, blunt_bangs, epaulettes, military_hat, white_shorts, from_behind, white_jacket, belt, short_shorts, bent_over, closed_mouth, long_sleeves, ass_focus, thighs, white_hair |
| 2 | 7 |  |  |  |  |  | 1girl, blunt_bangs, blush, from_behind, looking_at_viewer, looking_back, military_hat, military_uniform, peaked_cap, solo, white_thighhighs, ass_focus, white_panties, all_fours, white_jacket, belt, epaulettes, indoors, military_jacket, open_mouth, thighs |
| 3 | 7 |  |  |  |  |  | long_sleeves, 1girl, black_footwear, closed_mouth, holding_gun, looking_at_viewer, medium_breasts, solo, thigh_boots, assault_rifle, bodysuit, high_heel_boots, military, peaked_cap, standing, black_thighhighs, full_body, smile, uniform, white_jacket |
| 4 | 7 |  |  |  |  |  | 1boy, 1girl, hetero, open_mouth, penis, solo_focus, blush, sweat, looking_at_viewer, from_behind, mosaic_censoring, pov, sex, vaginal, anus, ass_focus, looking_back, motion_lines, smile |
| 5 | 17 |  |  |  |  |  | 1girl, blush, looking_at_viewer, bare_shoulders, cleavage, solo, blue_dress, hair_flower, official_alternate_costume, ribbon, closed_mouth, collarbone, nipples |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | military_uniform | solo | looking_at_viewer | open_mouth | bodysuit | long_sleeves | military_hat | belt | blush | thighhighs | looking_back | peaked_cap | white_thighhighs | blunt_bangs | epaulettes | white_shorts | from_behind | white_jacket | short_shorts | bent_over | closed_mouth | ass_focus | thighs | white_hair | white_panties | all_fours | indoors | military_jacket | black_footwear | holding_gun | medium_breasts | thigh_boots | assault_rifle | high_heel_boots | military | standing | black_thighhighs | full_body | smile | uniform | 1boy | hetero | penis | solo_focus | sweat | mosaic_censoring | pov | sex | vaginal | anus | motion_lines | bare_shoulders | cleavage | blue_dress | hair_flower | official_alternate_costume | ribbon | collarbone | nipples |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:-------|:--------------------|:-------------|:-----------|:---------------|:---------------|:-------|:--------|:-------------|:---------------|:-------------|:-------------------|:--------------|:-------------|:---------------|:--------------|:---------------|:---------------|:------------|:---------------|:------------|:---------|:-------------|:----------------|:------------|:----------|:------------------|:-----------------|:--------------|:-----------------|:--------------|:----------------|:------------------|:-----------|:-----------|:-------------------|:------------|:--------|:----------|:-------|:---------|:--------|:-------------|:--------|:-------------------|:------|:------|:----------|:-------|:---------------|:-----------------|:-----------|:-------------|:--------------|:-----------------------------|:---------|:-------------|:----------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | X | | | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | | | X | X | X | | X | X | X | X | X | | X | X | | | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | X | X | | X | X | | | | | | X | | | | | | X | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | X | X | | | | | X | | X | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 5 | 17 |  |  |  |  |  | X | | X | X | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
pharaouk/cortex_zz2 | ---
dataset_info:
features:
- name: prompts
dtype: string
- name: responses
dtype: string
splits:
- name: train
num_bytes: 824395111
num_examples: 351328
download_size: 432202127
dataset_size: 824395111
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_qblocks__mistral_7b_DolphinCoder | ---
pretty_name: Evaluation run of qblocks/mistral_7b_DolphinCoder
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [qblocks/mistral_7b_DolphinCoder](https://huggingface.co/qblocks/mistral_7b_DolphinCoder)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_qblocks__mistral_7b_DolphinCoder\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T08:38:41.844099](https://huggingface.co/datasets/open-llm-leaderboard/details_qblocks__mistral_7b_DolphinCoder/blob/main/results_2024-01-05T08-38-41.844099.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5955742366975546,\n\
\ \"acc_stderr\": 0.032892026757812796,\n \"acc_norm\": 0.6023520874451797,\n\
\ \"acc_norm_stderr\": 0.03357558761791825,\n \"mc1\": 0.2998776009791922,\n\
\ \"mc1_stderr\": 0.01604035296671362,\n \"mc2\": 0.43954153886534947,\n\
\ \"mc2_stderr\": 0.014894783303440727\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.568259385665529,\n \"acc_stderr\": 0.014474591427196204,\n\
\ \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.014332236306790149\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.628460466042621,\n\
\ \"acc_stderr\": 0.004822286556305222,\n \"acc_norm\": 0.8163712407886875,\n\
\ \"acc_norm_stderr\": 0.003863898546941602\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.02944517532819959,\n\
\ \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.02944517532819959\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.0416656757710158,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.0416656757710158\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6903225806451613,\n \"acc_stderr\": 0.026302774983517418,\n \"\
acc_norm\": 0.6903225806451613,\n \"acc_norm_stderr\": 0.026302774983517418\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n \"\
acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.025106820660539753,\n\
\ \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.025106820660539753\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524586,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524586\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096625,\n \
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096625\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.017149858514250948,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.017149858514250948\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n\
\ \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.033809398139433545,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.033809398139433545\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335428,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335428\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.014866821664709583,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.014866821664709583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.01435591196476786,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.01435591196476786\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666787,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.02623696588115326,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.02623696588115326\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579922,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579922\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.0293922365846125,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.0293922365846125\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43415906127770537,\n\
\ \"acc_stderr\": 0.012659033237067248,\n \"acc_norm\": 0.43415906127770537,\n\
\ \"acc_norm_stderr\": 0.012659033237067248\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6143790849673203,\n \"acc_stderr\": 0.019691459052354015,\n \
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.019691459052354015\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789855,\n\
\ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789855\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368032,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368032\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2998776009791922,\n\
\ \"mc1_stderr\": 0.01604035296671362,\n \"mc2\": 0.43954153886534947,\n\
\ \"mc2_stderr\": 0.014894783303440727\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7458563535911602,\n \"acc_stderr\": 0.012236307219708267\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2623199393479909,\n \
\ \"acc_stderr\": 0.012116912419925704\n }\n}\n```"
repo_url: https://huggingface.co/qblocks/mistral_7b_DolphinCoder
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|arc:challenge|25_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|gsm8k|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hellaswag|10_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T08-38-41.844099.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T08-38-41.844099.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- '**/details_harness|winogrande|5_2024-01-05T08-38-41.844099.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T08-38-41.844099.parquet'
- config_name: results
data_files:
- split: 2024_01_05T08_38_41.844099
path:
- results_2024-01-05T08-38-41.844099.parquet
- split: latest
path:
- results_2024-01-05T08-38-41.844099.parquet
---
# Dataset Card for Evaluation run of qblocks/mistral_7b_DolphinCoder
Dataset automatically created during the evaluation run of model [qblocks/mistral_7b_DolphinCoder](https://huggingface.co/qblocks/mistral_7b_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_qblocks__mistral_7b_DolphinCoder",
"harness_winogrande_5",
split="train")
```
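Each per-task entry in the aggregated results pairs a metric with its standard error. As a minimal sketch (no network access needed), the snippet below formats the "all" block of this run's latest results, whose values are copied from the JSON shown further down, as "value ± stderr" percentages; the helper name `format_metric` is illustrative, not part of any library:

```python
# Aggregated "all" metrics copied from the latest results JSON of this run.
latest_all = {
    "acc": 0.5955742366975546,
    "acc_stderr": 0.032892026757812796,
    "acc_norm": 0.6023520874451797,
    "acc_norm_stderr": 0.03357558761791825,
    "mc1": 0.2998776009791922,
    "mc1_stderr": 0.01604035296671362,
    "mc2": 0.43954153886534947,
    "mc2_stderr": 0.014894783303440727,
}

def format_metric(metrics: dict, name: str) -> str:
    """Render a metric and its standard error as 'value ± stderr' in percent."""
    return f"{metrics[name] * 100:.2f} ± {metrics[name + '_stderr'] * 100:.2f}"

for name in ("acc", "acc_norm", "mc1", "mc2"):
    print(f"{name}: {format_metric(latest_all, name)}")
```

The same pattern applies to any per-task entry (e.g. `harness|arc:challenge|25`) loaded from the "results" configuration.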
## Latest results
These are the [latest results from run 2024-01-05T08:38:41.844099](https://huggingface.co/datasets/open-llm-leaderboard/details_qblocks__mistral_7b_DolphinCoder/blob/main/results_2024-01-05T08-38-41.844099.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5955742366975546,
"acc_stderr": 0.032892026757812796,
"acc_norm": 0.6023520874451797,
"acc_norm_stderr": 0.03357558761791825,
"mc1": 0.2998776009791922,
"mc1_stderr": 0.01604035296671362,
"mc2": 0.43954153886534947,
"mc2_stderr": 0.014894783303440727
},
"harness|arc:challenge|25": {
"acc": 0.568259385665529,
"acc_stderr": 0.014474591427196204,
"acc_norm": 0.5972696245733788,
"acc_norm_stderr": 0.014332236306790149
},
"harness|hellaswag|10": {
"acc": 0.628460466042621,
"acc_stderr": 0.004822286556305222,
"acc_norm": 0.8163712407886875,
"acc_norm_stderr": 0.003863898546941602
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6452830188679245,
"acc_stderr": 0.02944517532819959,
"acc_norm": 0.6452830188679245,
"acc_norm_stderr": 0.02944517532819959
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.0416656757710158,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.0416656757710158
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517418,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517418
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.025106820660539753,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.025106820660539753
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524586,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524586
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096625,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096625
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.017149858514250948,
"acc_norm": 0.8,
"acc_norm_stderr": 0.017149858514250948
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.033809398139433545,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.033809398139433545
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335428,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335428
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.014866821664709583,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.014866821664709583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.01435591196476786,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.01435591196476786
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666787,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.02623696588115326,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.02623696588115326
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579922,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43415906127770537,
"acc_stderr": 0.012659033237067248,
"acc_norm": 0.43415906127770537,
"acc_norm_stderr": 0.012659033237067248
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.019691459052354015,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.019691459052354015
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789855,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789855
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368032,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368032
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2998776009791922,
"mc1_stderr": 0.01604035296671362,
"mc2": 0.43954153886534947,
"mc2_stderr": 0.014894783303440727
},
"harness|winogrande|5": {
"acc": 0.7458563535911602,
"acc_stderr": 0.012236307219708267
},
"harness|gsm8k|5": {
"acc": 0.2623199393479909,
"acc_stderr": 0.012116912419925704
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
perlthoughts/gefilte-fish | ---
license: apache-2.0
---
Code used to build the dataset:
```python
# used when training samples do not include a system prompt.
DEFAULT_SYSTEM_PROMPT = "Below is an instruction that describes a task. Write a response that appropriately completes the request."
# an item was skipped if its prompt or system prompt contained any of these bad words.
BAD_WORDS = [
"english", "translate", "russian", "chinese", "japanese", "spanish", "persian", "french", "german", "italian", "korean",
"arabic", "hindi", "portuguese", "turkish", "vietnamese", "indonesian", "thai", "polish", "dutch", "greek", "czech",
"romanian", "swedish", "danish", "finnish", "hungarian", "norwegian", "slovak", "slovenian", "lithuanian", "latvian",
"estonian", "bulgarian", "serbian", "ukrainian", "belarusian", "croatian", "bosnian", "macedonian", "albanian", "icelandic",
"irish", "welsh", "scottish", "latin", "esperanto", "hebrew", "yiddish", "afrikaans", "swahili", "zulu", "xhosa", "sotho",
"sesotho", "somali", "hausa", "igbo", "yoruba", "malay", "tagalog", "hawaiian", "maori", "mongolian", "tamil", "telugu",
"kannada", "gujarati", "marathi", "punjabi", "nepali", "sinhala", "khmer", "lao", "burmese", "tibetan", "georgian",
"azerbaijani", "kurdish", "armenian", "kazakh", "uzbek", "tajik", "kirghiz", "turkmen", "tatar", "bashkir", "chechen",
"chuvash", "ossetian", "moldavian", "moldovan", "language model", " AI ", "openai", "gpt", "gpt-2", "gpt-3", "gpt2", "gpt3", "gpt4",
"gpt-4", "illegal", "harmful", "cannot provide", "yourself or others", "harm to yourself", "cannot suggest", "morals", "ethical",
"cannot answer", "can't answer", "don't know", "no answer", "no response", "i can't", "not enough information", "insufficient",
"it is not possible", "not answerable", "unfortunately", "can't answer", "am not sure", "davinci-0", "ada-0", "babbage-0", "curie-0",
]
TOTAL_ITEMS = 100000
# all datasets used and the percentage/ratio of each from the total.
DATASETS = {
"migtissera/Synthia-v1.3": {
"ratio": 0.2, "set": "train",
"system": "system", "prompt": "instruction", "output": "response",
},
"meta-math/MetaMathQA": {
"ratio": 0.1, "set": "train",
"system": DEFAULT_SYSTEM_PROMPT, "prompt": "query", "output": "response",
},
"HuggingFaceH4/ultrafeedback_binarized": {
"ratio": 0.3, "set": "train_sft",
"system": DEFAULT_SYSTEM_PROMPT, "prompt": "prompt", "output": "get_assistant(chosen)",
},
"ehartford/dolphin": {
"ratio": 0.3, "set": "train",
"system": "instruction", "prompt": "input", "output": "output",
},
"Open-Orca/OpenOrca": {
"ratio": 0.1, "set": "train",
"system": "system_prompt", "prompt": "question", "output": "response",
},
}
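# ---------------------------------------------------------------------------
# Illustrative helper (ours, not part of the original build script): given
# the ratio table above, the number of items drawn from each source is
# int(TOTAL_ITEMS * ratio).
def items_per_dataset(total, datasets):
    """Return {dataset_name: sample_count} for a ratio-weighted mix."""
    return {name: int(total * cfg["ratio"]) for name, cfg in datasets.items()}
# items_per_dataset(TOTAL_ITEMS, DATASETS) sums back to TOTAL_ITEMS because
# the ratios above (0.2 + 0.1 + 0.3 + 0.3 + 0.1) add up to 1.0.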
``` |
Mxode/Meow-Instruct-12k | ---
license: apache-2.0
task_categories:
- conversational
- text-generation
language:
- zh
pretty_name: meow-12k
size_categories:
- 10K<n<100K
---
A collection of sayings from a little cat.
A longer version is available here: [Mxode/Meow-Instruct-34k](https://huggingface.co/datasets/Mxode/Meow-Instruct-34k)
polinaeterna/pokemon-blip-captions | ---
annotations_creators:
- machine-generated
language_creators:
- other
language:
- en
license: cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- n<1K
source_datasets:
- huggan/few-shot-pokemon
task_categories:
- text-to-image
task_ids: []
pretty_name: Pokémon BLIP captions
tags: []
duplicated_from: lambdalabs/pokemon-blip-captions
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 119417305.0
num_examples: 833
download_size: 0
dataset_size: 119417305.0
---
# Dataset Card for Pokémon BLIP captions
_Dataset used to train [Pokémon text to image model](https://github.com/LambdaLabsML/examples/tree/main/stable-diffusion-finetuning)_
BLIP generated captions for Pokémon images from Few Shot Pokémon dataset introduced by _Towards Faster and Stabilized GAN Training for High-fidelity Few-shot Image Synthesis_ (FastGAN). Original images were obtained from [FastGAN-pytorch](https://github.com/odegeasslbc/FastGAN-pytorch) and captioned with the [pre-trained BLIP model](https://github.com/salesforce/BLIP).
For each row the dataset contains `image` and `text` keys. `image` is a varying-size PIL JPEG, and `text` is the accompanying text caption. Only a train split is provided.
## Examples

> a drawing of a green pokemon with red eyes

> a green and yellow toy with a red nose

> a red and white ball with an angry look on its face
## Citation
If you use this dataset, please cite it as:
```
@misc{pinkney2022pokemon,
author = {Pinkney, Justin N. M.},
title = {Pokemon BLIP captions},
year={2022},
howpublished= {\url{https://huggingface.co/datasets/lambdalabs/pokemon-blip-captions/}}
}
``` |
ddedde/fashion_image_caption-100-v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 22820471.0
num_examples: 100
download_size: 22820373
dataset_size: 22820471.0
---
# Dataset Card for "fashion_image_caption-100-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
polymath707/vietnamese-handmade | ---
license: apache-2.0
---
|
BangumiBase/danshikoukouseinonichijou | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Danshi Koukousei No Nichijou
This is the image base of bangumi Danshi Koukousei no Nichijou, we detected 25 characters, 1831 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 320 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 127 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 364 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 29 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 75 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 106 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 20 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 54 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 61 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 69 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 21 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 21 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 54 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 9 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 46 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 229 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 29 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 36 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 56 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 7 | [Download](19/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 20 | 12 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 28 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 7 | [Download](22/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 23 | 7 | [Download](23/dataset.zip) |  |  |  |  |  |  |  | N/A |
| noise | 44 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
Awiny/Howto-Interlink7M | ---
license: apache-2.0
---
# Howto-Interlink7M
## 📙 Overview
Howto-Interlink7M presents a unique interleaved video-text dataset, carefully derived from the raw video content of [Howto100M](https://www.di.ens.fr/willow/research/howto100m/).
<img src="howto_interlink7m_ppl.png" width="75%" height="75%">
In the creation of this dataset, we turn **long videos into vision-text interleaved documents** using BLIP2 (image captioner), GRIT (image detector), and Whisper (ASR), similar to [VLog](https://github.com/showlab/VLog).
Then, we employed **GPT-4** to produce an extensive set of **7 million** high-quality pretraining documents.
During this process, we meticulously filtered out clips containing sensitive or low-quality content.
<img src="https://cdn-uploads.huggingface.co/production/uploads/64440be5af034cdfd69ca3a7/tCl0r7zasZwwV1qJF1OJN.png" width="50%" height="50%">
## 📊 Statistics
The statistics are listed below:
| Split | Samples | Average Clips | Average Clip Length | Average Document Tokens |
|---|---|---|---| --- |
| Howto-Interlink7M_subset_w_all_clips_train.tsv | 276711 | 8.4 | 49.8 | 460.3 |
| Howto-Interlink7M_subset_w_all_clips_val.tsv | 30746 | 8.4 | 49.8 | 460.2 |
| Howto-Interlink7M_subset_w_sampled_clips_train.tsv | 660827 | 5.8 | 47.2 |319.4 |
| Howto-Interlink7M_subset_w_sampled_clips_val.tsv| 73426| 5.8 | 47.2 | 319.8 |
|All| 1041710| 6.6 | 48.0 | 361.0|
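The "All" row is the sample-weighted average of the four split rows; a quick sanity check (numbers copied from the table above):

```python
# Verify the aggregate row of the statistics table as a weighted average.
splits = [
    # (samples, avg_clips, avg_clip_length, avg_document_tokens)
    (276711, 8.4, 49.8, 460.3),
    (30746, 8.4, 49.8, 460.2),
    (660827, 5.8, 47.2, 319.4),
    (73426, 5.8, 47.2, 319.8),
]

total = sum(n for n, *_ in splits)

def weighted_avg(col):
    """Sample-weighted mean of column `col` (0=clips, 1=length, 2=tokens)."""
    return sum(n * row[col] for n, *row in splits) / total

print(total)                       # 1041710
print(round(weighted_avg(0), 1))   # 6.6
```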
## 🎨 Visualization

Please see [Youtube](https://www.youtube.com/watch?v=z3uOI6oInto) for more examples.
## 🏋️ Training
Please refer to code [cosmo](https://github.com/showlab/cosmo/) for training details.
## Download Source Video
### 1. Download the README and All-in-One zip file:
On the official website [HowTo100M](https://www.di.ens.fr/willow/research/howto100m/), locate the download links for the README and the All-in-One zip file.
Extract the contents of the All-in-One zip file:
### 2. Inside the extracted folder, you should find the HowTo100M_v1.csv file.
### 3. In the CSV file, you will find a column named "video_id" which contains unique identifiers for each video.
You can use youtube-dl or similar tools to download the videos using the video IDs listed in the CSV file.
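As a sketch of the steps above (the helper names here are ours, not part of the dataset), the `video_id` column can be turned into `youtube-dl` command lines:

```python
import csv
import io

def download_commands(csv_file):
    """Build one youtube-dl command per row of the HowTo100M CSV."""
    return [
        ["youtube-dl", "-o", f"{row['video_id']}.%(ext)s",
         f"https://www.youtube.com/watch?v={row['video_id']}"]
        for row in csv.DictReader(csv_file)
    ]

# tiny in-memory demo; in practice pass open("HowTo100M_v1.csv")
demo = io.StringIO("video_id,category_1\nabc123,Food\nxyz789,Hobbies\n")
cmds = download_commands(demo)
print(cmds[0][-1])  # https://www.youtube.com/watch?v=abc123
```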
## 🎓 Citation
```
@article{wang2024cosmo,
title={COSMO: Contrastive Streamlined Multimodal Model with Interleaved Pre-Training},
author={Wang, Alex Jinpeng and Li, Linjie and Lin, Kevin Qinghong and Wang Jianfeng and Lin, Kevin and Yang, Zhengyuan and Wang, Lijuan and Shou, Mike Zheng},
journal={arXiv preprint arXiv:2401.00849},
year={2024}
}
``` |
autoevaluate/autoeval-eval-xsum-default-5ccdc1-40225145066 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: stacked-summaries/flan-t5-large-stacked-xsum-1024
metrics: []
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: stacked-summaries/flan-t5-large-stacked-xsum-1024
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
gaizerick/samira | ---
license: openrail
---
|
sidhellman/earnings | ---
license: apache-2.0
---
|
liyucheng/chinese_metaphor_dataset | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-generation
language:
- zh
tags:
- metaphor
- figurative language
pretty_name: CMC
size_categories:
- 1K<n<10K
---
# Chinese Metaphor Corpus (CMC)
## Dataset Description
- **Homepage:** https://github.com/liyucheng09/Metaphor_Generator
- **Repository:** https://github.com/liyucheng09/Metaphor_Generator
- **Paper:** CM-Gen: A Neural Framework for Chinese Metaphor Generation with Explicit Context Modelling
- **Leaderboard:**
- **Point of Contact:** liyucheng09@gmail.com
### Dataset Summary
The first Chinese metaphor corpus serving both metaphor identification and generation. We construct a large metaphor resource in Chinese with around 9,000 metaphorical sentences annotated with tenor and vehicle. Check out more details in the [github repo](https://github.com/liyucheng09/Metaphor_Generator) and our [paper](https://aclanthology.org/2022.coling-1.563/) presented at COLING 2022.
The first Chinese metaphor dataset, usable for both Chinese metaphor identification and Chinese metaphor generation. See more details on [Zhihu](https://zhuanlan.zhihu.com/p/572740322).
### Languages
Chinese
### Citation Information
```
@inproceedings{li-etal-2022-cm,
title = "{CM}-Gen: A Neural Framework for {C}hinese Metaphor Generation with Explicit Context Modelling",
author = "Li, Yucheng and
Lin, Chenghua and
Guerin, Frank",
booktitle = "Proceedings of the 29th International Conference on Computational Linguistics",
month = oct,
year = "2022",
address = "Gyeongju, Republic of Korea",
publisher = "International Committee on Computational Linguistics",
url = "https://aclanthology.org/2022.coling-1.563",
pages = "6468--6479",
}
``` |
joey234/mmlu-management-original-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 7492
num_examples: 25
download_size: 8550
dataset_size: 7492
---
# Dataset Card for "mmlu-management-original-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/seiran_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of seiran/清蘭/세이란 (Touhou)
This is the dataset of seiran/清蘭/세이란 (Touhou), containing 500 images and their tags.
The core tags of this character are `animal_ears, rabbit_ears, blue_hair, red_eyes, long_hair, bangs, twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 582.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/seiran_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 350.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/seiran_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1189 | 751.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/seiran_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 526.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/seiran_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1189 | 1.01 GiB | [Download](https://huggingface.co/datasets/CyberHarem/seiran_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/seiran_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, blue_dress, crescent_print, earclip, frills, holding, kine, puffy_short_sleeves, solo, star_print, white_background, blush, closed_mouth, looking_at_viewer, smile, simple_background |
| 1 | 6 |  |  |  |  |  | 1girl, blue_dress, crescent_print, earclip, frills, holding, kine, open_mouth, puffy_short_sleeves, smile, solo, star_print, blush, hair_between_eyes, one-hour_drawing_challenge, rabbit_tail |
| 2 | 6 |  |  |  |  |  | 1girl, blue_dress, blush, hair_between_eyes, puffy_short_sleeves, simple_background, solo, white_background, looking_at_viewer, open_mouth, blue_skirt, upper_body |
| 3 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, solo, playboy_bunny, rabbit_tail, blush, wrist_cuffs, bare_shoulders, alternate_costume, bowtie, detached_collar, earclip, simple_background, covered_navel, white_background, ass, black_pantyhose, blue_leotard, closed_mouth, large_breasts, low_twintails, open_mouth, smile, standing |
| 4 | 17 |  |  |  |  |  | white_shirt, collared_shirt, red_necktie, long_sleeves, 1girl, looking_at_viewer, black_jacket, blazer, pink_skirt, solo, smile, hair_between_eyes, pleated_skirt, closed_mouth, purple_hair, standing, blush, crescent_pin, multiple_girls, open_mouth |
| 5 | 7 |  |  |  |  |  | 1girl, blush, hetero, 1boy, nipples, open_mouth, penis, solo_focus, nude, pussy, small_breasts, earclip, mosaic_censoring, navel, sex, tears, blue_dress, cum, looking_at_viewer, star_print, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_dress | crescent_print | earclip | frills | holding | kine | puffy_short_sleeves | solo | star_print | white_background | blush | closed_mouth | looking_at_viewer | smile | simple_background | open_mouth | hair_between_eyes | one-hour_drawing_challenge | rabbit_tail | blue_skirt | upper_body | playboy_bunny | wrist_cuffs | bare_shoulders | alternate_costume | bowtie | detached_collar | covered_navel | ass | black_pantyhose | blue_leotard | large_breasts | low_twintails | standing | white_shirt | collared_shirt | red_necktie | long_sleeves | black_jacket | blazer | pink_skirt | pleated_skirt | purple_hair | crescent_pin | multiple_girls | hetero | 1boy | nipples | penis | solo_focus | nude | pussy | small_breasts | mosaic_censoring | navel | sex | tears | cum | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:-----------------|:----------|:---------|:----------|:-------|:----------------------|:-------|:-------------|:-------------------|:--------|:---------------|:--------------------|:--------|:--------------------|:-------------|:--------------------|:-----------------------------|:--------------|:-------------|:-------------|:----------------|:--------------|:-----------------|:--------------------|:---------|:------------------|:----------------|:------|:------------------|:---------------|:----------------|:----------------|:-----------|:--------------|:-----------------|:--------------|:---------------|:---------------|:---------|:-------------|:----------------|:--------------|:---------------|:-----------------|:---------|:-------|:----------|:--------|:-------------|:-------|:--------|:----------------|:-------------------|:--------|:------|:--------|:------|:----------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | X | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | | | | | | X | X | | X | X | | X | | X | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | | | X | | | | | X | | X | X | X | X | X | X | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 17 |  |  |  |  |  | X | | | | | | | | X | | | X | X | X | X | | X | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | X | | X | | | | | | X | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
mila-intel/ProtST-GeneOntology-MF | ---
license: apache-2.0
---
|
vwxyzjn/cai-conversation-prod-h4 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: prompt
dtype: string
- name: init_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: init_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: critic_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: critic_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: revision_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: revision_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 608100382
num_examples: 160800
- name: test
num_bytes: 32621318
num_examples: 8552
download_size: 288349996
dataset_size: 640721700
---
# Dataset Card for "cai-conversation-prod-h4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jacobbieker/goes-imerg-42hour-test | ---
license: mit
---
|
gsstein/raw_summ_test | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: prompt
dtype: string
- name: raw_summary
dtype: string
splits:
- name: train
num_bytes: 49752
num_examples: 5
- name: test
num_bytes: 53226
num_examples: 5
- name: validation
num_bytes: 36941
num_examples: 5
download_size: 185993
dataset_size: 139919
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
qgiaohc/twitter_dataset_1713116109 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 22481
num_examples: 51
download_size: 12888
dataset_size: 22481
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
VARAG/VARAG_Dataset | ---
license: mit
---
|
pragnakalp/squad_v2_french_translated | ---
language: fr
license: apache-2.0
multilinguality:
- monolingual
- translation
---
Using Google Translate, we have translated the SQuAD 2.0 dataset into multiple languages.
Here is the translated dataset of SQuAD 2.0 in French language.
Shared by [Pragnakalp Techlabs](https://www.pragnakalp.com) |