| datasetId | card |
|---|---|
fathyshalab/MDCSI_oeffentlichkeit-soziales | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: label_name
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 58929
num_examples: 108
- name: test
num_bytes: 14508
num_examples: 27
download_size: 45545
dataset_size: 73437
---
# Dataset Card for "reklamation24_oeffentlichkeit-soziales-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Abhineet/FlagSeg | ---
license: cc-by-4.0
task_categories:
- image-segmentation
- object-detection
language:
- en
pretty_name: Multi-country Flag Segmentation and detection dataset
size_categories:
- 10K<n<100K
---
If you find this dataset useful and use it in your work, please cite:
```
@inproceedings{wu2021flagdetseg,
  title={FlagDetSeg: Multi-nation flag detection and segmentation in the wild},
  author={Wu, Shou-Fang and Chang, Ming-Ching and Lyu, Siwei and Wong, Cheng-Shih and Pandey, Abhineet Kumar and Su, Po-Chi},
  booktitle={2021 17th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS)},
  pages={1--8},
  year={2021},
  organization={IEEE}
}
```
 |
madhaviit/llama10kT1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 14141408
num_examples: 10000
download_size: 4940492
dataset_size: 14141408
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
GGital/ChestXRAY-3 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 280355669.22
num_examples: 5228
download_size: 286999789
dataset_size: 280355669.22
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
NomeIncrivel/Spok | ---
license: openrail
---
|
cakiki/fr_wikipedia_trigrams | ---
language: fr
--- |
kurehamnm/dataset | ---
license: unknown
---
|
opennyaiorg/aalap_instruction_dataset | ---
language:
- en
license: other
size_categories:
- 10K<n<100K
task_categories:
- text-generation
tags:
- legal
dataset_info:
features:
- name: input_text
dtype: string
- name: system_prompt
dtype: string
- name: user_prompt
dtype: string
- name: output_text
dtype: string
- name: task
dtype: string
- name: combined_input_prompt
dtype: string
splits:
- name: train
num_bytes: 188932482
num_examples: 21178
- name: test
num_bytes: 8307879
num_examples: 1094
download_size: 97051698
dataset_size: 197240361
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Aalap Instruction dataset
<!-- Provide a quick summary of the dataset. -->
This dataset aims to build an AI assistant for legal and paralegal functions in India (Aalap). The main objective behind creating Aalap was to train a small LLM that specializes in specific legal tasks, with a focus on legal reasoning, so prioritizing which legal tasks to cover was an important step.
We discussed with legal practitioners which legal tasks they would want automated, and then selected the most common of those tasks for which publicly available input datasets exist. Since the Indian judiciary operates mainly in English, all the datasets are currently in English.
## Common Data Sources
The following are the commonly used data sources for preparing the input instructions.
1. **Judgment Facts**: Case facts are typically written in a court judgment. These facts were extracted from judgment texts using Opennyai's rhetorical roles model (Kalamkar et al., 2022) on randomly chosen Indian Supreme Court and High Court judgments. The judgment facts were used as inputs for multiple tasks, such as argument generation, issue generation, and event timeline creation.
2. **Judgment Statutes**: The relevant statutes of a judgment were extracted from the corresponding judgment text using Opennyai's Named Entity Recognition model (Kalamkar et al., 2022).
3. **FIRs**: Publicly available police First Information Reports (FIRs) from the states of Maharashtra and Delhi were used to collect descriptions of real-life events. These FIRs were used to create the event timeline tasks.
## Dataset Details
| Task Name | Task Description | Data Creation Methodology |
|--------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Issues Generation | Based on the case facts, create legal issues for a court. Legal issues are the key points on which the verdict needs to be delivered. | Judgment facts were then sent to gpt-3.5-turbo to generate legal issues. |
| Argument Generation | Based on the facts of a case, legal issues, and applicable statute names, generate arguments for a given party. Arguments for the petitioner and counterarguments for the respondent are provided. | Arguments for the petitioners were created using gpt-3.5-turbo, using judgment facts, generated issues, and relevant statutes as inputs. Counterarguments for defendants were created using these petitioners' arguments, legal issues, case facts, and relevant statutes. |
| Event Timeline | Extract important events and their dates from the input text descriptions of events and output a chronologically sorted event list with dates and a brief event description. | FIRs and judgment facts were used as input text descriptions. These were sent to gpt-3.5-turbo to create event descriptions and dates. |
| Combine Event Timelines | For extraction of event timelines from very long texts, it is often split into chunks, and the event timeline for each chunk is created independently, which are merged later. | Individually created timelines coming from the same judgment or FIR were merged using gpt-3.5-turbo. |
| Legalbench | [Legalbench data](https://huggingface.co/datasets/nguha/legalbench) contains multiple-choice questions from 162 different legal tasks. | Training data for the legalbench was filtered to keep only the legal reasoning tasks. ORCA-style explanations of these MCQs were created using GPT4 in a zero-shot setting. |
| Statute Ingredients | Break the input statute definition into the ingredients needed to apply the statute. | Definitions of the most popular sections of Indian central acts were used to generate the statute ingredients using gpt-3.5-turbo. |
| Summary Generation | Create a summary in judgment headnotes format using the input court judgment text | Indian Supreme Court judgments from 1950 to 1994 are published with headnotes, which are summaries of those judgments. |
| Legal Open ORCA                | The [OpenORCA dataset](https://huggingface.co/datasets/Open-Orca/OpenOrca/blob/main/1M-GPT4-Augmented.parquet) is an ORCA-style explanation of the Natural Instructions dataset.                      | The Natural Instructions dataset (NI) is filtered for law-related datasets. This filtered NI data is matched against the 1M GPT4 OpenORCA dataset using tf-idf matching to get the legal OpenORCA data.                                                                       |
| Contract Clause Generation | Generation of new contract clauses and modification of existing contract clauses as per the instructions. | [existing data](https://huggingface.co/datasets/NebulaSense/Legal_Clause_Instructions) |
| Legal NIv2 MCQ | [Natural Instructions v2 data](https://github.com/allenai/natural-instructions) consists of multiple-choice questions about diverse topics. We filtered this data for law-related questions. | A random sample of NI dataset MCQs filtered for law. This helps to teach a model to give concise answers, which is helpful during evaluation. |
| Constitution General Knowledge | General-knowledge questions and answers about various aspects of the Constitution of India.                                                                                                          | Taken from [nisaar/Constitution_of_India](https://huggingface.co/datasets/nisaar/Constitution_of_India).                                                                                                                                                                     |
| Incomplete Instructions | These are the examples where the given information is incomplete for completing the task. In such cases, the response is to ask for the required information. | Randomly selected instructions belonging to each of these tasks above where the information is incomplete. E.g., the LLM prompt does not specify the facts of the case and asks to generate arguments. |
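The "Event Timeline" and "Combine Event Timelines" tasks above reduce to merging per-chunk event lists and sorting them chronologically. A minimal sketch in Python, where the event texts and the `date`/`description` field names are illustrative assumptions rather than the dataset's actual schema:

```python
from datetime import date

# Hypothetical chunk-level timelines, produced independently per text chunk.
timeline_a = [
    {"date": date(2001, 3, 14), "description": "FIR registered"},
    {"date": date(2000, 11, 2), "description": "Alleged incident occurred"},
]
timeline_b = [
    {"date": date(2002, 7, 9), "description": "Charge sheet filed"},
]

def combine_timelines(*timelines):
    """Merge per-chunk event timelines and sort the result chronologically."""
    merged = [event for timeline in timelines for event in timeline]
    return sorted(merged, key=lambda event: event["date"])

combined = combine_timelines(timeline_a, timeline_b)
print([e["description"] for e in combined])
# → ['Alleged incident occurred', 'FIR registered', 'Charge sheet filed']
```

In the dataset itself this merge-and-sort step is performed by gpt-3.5-turbo over free-text event descriptions; the sketch only shows the target behavior on already-structured events.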
The table below shows the summary statistics of various task categories and licenses associated with each dataset.
| Task Category | train count | test count | Average input tokens | Average output tokens | License |
|--------------------------------|----------------------|---------------------|-------------------------------|--------------------------------|------------------|
| Issue Generation | 577 | 24 | 1376 | 161 | CC0-1.0 |
| Argument Generation | 1142 | 58 | 2381 | 943 | CC0-1.0 |
| Event Timeline | 676 | 49 | 3476 | 342 | CC0-1.0 |
| Combine Event Timeline | 195 | 9 | 883 | 772 | CC0-1.0 |
| Legalbench                     | 580                  | 27                  | 229                           | 218                            | Other            |
| Statute Ingredients | 499 | 27 | 402 | 111 | CC0-1.0 |
| Summary Generation | 686 | 14 | 7635 | 1413 | CC0-1.0 |
| Legal Open ORCA | 8142 | 413 | 449 | 91 | MIT |
| Contract Clause Generation | 4325 | 232 | 76 | 179 | cc-by-nc-4.0 |
| Legal NIv2 MCQ | 1891 | 109 | 408 | 9 | Apache 2.0 |
| Constitution General Knowledge | 889 | 44 | 36 | 85 | Apache 2.0 |
| Incomplete Instructions | 1464 | 82 | 97 | 81 | CC0-1.0 |
| General Aalap                  | 112                  | 6                   | 31                            | 25                             | CC0-1.0          |
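The tf-idf matching used to derive the "Legal Open ORCA" subset (law-filtered Natural Instructions questions matched against the 1M-GPT4 OpenORCA pool) can be illustrated with a hand-rolled tf-idf cosine similarity. The sample texts, the smoothed-idf scheme, and the best-match selection here are illustrative assumptions, not the authors' exact pipeline:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Smoothed tf-idf vectors over whitespace-tokenized, lowercased docs."""
    tokenized = [doc.lower().split() for doc in docs]
    n = len(tokenized)
    df = Counter(term for tokens in tokenized for term in set(tokens))
    vectors = []
    for tokens in tokenized:
        tf = Counter(tokens)
        vectors.append({t: c * (math.log((1 + n) / (1 + df[t])) + 1) for t, c in tf.items()})
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse dict vectors."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Invented examples: one law-related NI-style question, two OpenORCA-style entries.
ni_question = "which contract clause is unenforceable"
orca_pool = [
    "explain why this contract clause is unenforceable",
    "describe the plot of the movie",
]
vecs = tfidf_vectors([ni_question] + orca_pool)
scores = [cosine(vecs[0], v) for v in vecs[1:]]
best = orca_pool[scores.index(max(scores))]
print(best)  # the law-related entry scores highest
```

In practice such matching is usually done with a library vectorizer over the full corpora; this sketch only shows the mechanism.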
## Guiding Principles
1. **Focus on legal reasoning rather than legal recall**: Relying on LLMs to generate legal precedents and statute definitions is a bad idea, because LLMs are likely to hallucinate and generate false content. Instead, precedent information and statutes should be retrieved from authentic sources and given as input to the LLM, letting the LLM do the legal reasoning. This is in line with how humans work.
2. **Use real-life situations**: Make the datasets as close as possible to real-life situations. Hence, we used descriptions from court judgments and police First Investigation Reports (FIR) so that the model learns the language used in such texts.
3. **Explain answers**: ORCA-style explanations of answers help teach models the reasoning behind answers to multiple-choice questions. We used GPT4 to create explanations wherever possible and reused relevant legal datasets that already provide such explanations.
4. **Use synthetically generated responses if needed**: Many times, the actual documents, like lawyers' pleadings, witness statements, medical reports, etc., are not available publicly. So, we have used synthetically generated responses from OpenAI models like GPT4 and gpt-3.5-turbo wherever needed.
- **Curated by:** [OpenNyAI team](https://opennyai.org/)
- **License:** Since this is a compilation of multiple datasets, please refer to each of the datasets for the license information.
## Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
1) Precedent information is missing from argument generation. Precedents are important elements in supporting generated arguments, and building an argument requires fetching the right paragraphs from precedents based on the situation at hand. Since we did not have access to an automatic tool that can do this, we excluded precedent information from the generated arguments.
2) Facts in judgments are worded quite differently from the statements, filings, and other reports that lawyers actually receive as inputs. Since that information is not public, we had to rely on publicly available datasets.
|
ambet/function-calling-with-dict-format | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 34020.36666666667
num_examples: 67
- name: test
num_bytes: 11678.633333333333
num_examples: 23
download_size: 17406
dataset_size: 45699.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
SSEF-HG-AC/cyberbullying-instagram-balanced | ---
license: cc
---
|
Achuvin/achiii | ---
license: afl-3.0
---
|
liuyanchen1015/MULTI_VALUE_stsb_emphatic_reflex | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1059
num_examples: 5
- name: test
num_bytes: 170
num_examples: 1
- name: train
num_bytes: 1798
num_examples: 8
download_size: 11041
dataset_size: 3027
---
# Dataset Card for "MULTI_VALUE_stsb_emphatic_reflex"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vikhrmodels/Flan_translated_300k | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 332929649
num_examples: 373184
download_size: 144707378
dataset_size: 332929649
license: mit
task_categories:
- question-answering
language:
- ru
---
# Dataset Card for "flan_translated_300k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
# Citation
```
@misc{AlexWortega/flan_translated_300k,
  author = {Pavel Ilin and Ksenia Zolian and Ilya Kuleshov and Egor Kokush and Aleksandr Nikolich},
  title = {Russian Flan translated},
  url = {https://huggingface.co/datasets/AlexWortega/flan_translated_300k},
  year = {2023}
}
``` |
AaronLi/TTS-test-wavs | ---
license: mit
---
# TTS-test-wavs
For testing.
|
BelleGroup/train_3.5M_CN | ---
license: gpl-3.0
task_categories:
- text2text-generation
language:
- zh
size_categories:
- 1M<n<10M
---
## Contents
Contains about 3.5 million Chinese instruction-following examples generated by the [BELLE](https://github.com/LianjiaTech/BELLE) project.
## Samples
```
{
"id":"66623028",
"conversations":[
{
"from":"human",
"value":"归纳介绍三种不同类型的电影题材"
},
{
"from":"assistant",
"value":"1. 动作片:动作片通常强调动作和战斗场面,故事情节紧凑,充满了紧张和兴奋的氛围。主角一般是具有超凡实力或特殊技能的英雄人物,往往会与反派角色进行一系列激烈的格斗,追逐和爆破等酷炫场面。例如《速度与激情》和《复仇者联盟》等。\n2. 爱情片:爱情片是以爱情为主题的电影,着重呈现角色之间的感情纠葛,展现了爱情的美好和悲壮。情节与角色之间的发展令人深思,增加了观众的感性体验。这种电影的故事情节主要以人类与人类之间的情感变化为主题。例如《泰坦尼克号》和《让子弹飞》等。\n3. 科幻片:科幻片是以未来科技和未知世界为主题的电影,通常包括时间旅行、外星人、虚拟现实、未来社会等各种奇幻的元素。这种电影描绘了一种比现实更加宏伟、更加神秘和惊奇的世界。例如《星际穿越》和《发条橙》等。"
}
]
}
{
"id":"18205437",
"conversations":[
{
"from":"human",
"value":"根据给定的食材,编写一道美食菜谱。\n鸡腿肉、蘑菇、洋葱"
},
{
"from":"assistant",
"value":"鸡肉炒蘑菇洋葱\n食材:\n- 鸡腿肉 400克,去骨去皮切成小块\n- 蘑菇 200克,切片\n- 洋葱 1个,切丝\n- 生姜 适量,切丝\n- 酱油 2勺\n- 料酒 1勺\n- 盐 适量\n- 白胡椒粉 适量\n- 植物油 适量\n做法:\n1. 在一个碗里放入切好的鸡腿肉块,加入酱油、料酒和适量的白胡椒粉,搅拌均匀,腌制10分钟。\n2. 热锅凉油,加入姜丝翻炒,放入腌好的鸡肉块翻炒至变色。\n3. 加入洋葱丝和蘑菇,翻炒2-3分钟至蘑菇变软。\n4. 加入适量的盐和少量的水,盖上锅盖焖煮5分钟直至鸡肉熟透。\n5. 最后翻炒几下,使汤汁变浓稠即可。\n温馨提示:\n1. 在腌制鸡肉时,如果想让肉质更嫩滑,可以加入少量的淀粉和蛋清。\n2. 可以根据个人口味加入其他调料,如蒜末、辣椒等。\n3. 洋葱和蘑菇可以用其他蔬菜代替,如青椒、胡萝卜等。"
},
{
"from":"human",
"value":"太好了,我会试着照着你教的做一次,但是我最近买了一部新相机,想让你推荐一些好的拍照地点。"
},
{
"from":"assistant",
"value":"当地的风景区、公园、景点等等都是比较适合拍照的地点。如果你想要一些特别的地方,可以考虑一些建筑群,街道巷子、咖啡馆、书店等具有文艺气息的地方。"
},
{
"from":"human",
"value":"嗯,这些地方都很不错,但是我还想找一些可以体验当地文化和风俗的地点,你有什么好的建议吗?"
},
{
"from":"assistant",
"value":"如果你想要更深入地了解当地文化和风俗,可以考虑参加一些游学活动,比如参观当地的博物馆、参加民俗节庆、尝试当地美食等等。此外,也可以去一些小城镇,感受那里的古朴风情。"
}
]
}
```
### Fields:
```
id: example ID
conversations: example content, given as a conversation; includes both multi-turn and single-turn dialogue data
```
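A minimal sketch of pairing turns in this conversation format, using the field names from the sample above; the `to_turn_pairs` helper is illustrative and assumes strictly alternating human/assistant turns:

```python
import json

# One record in the dataset's format (taken from the sample above, abridged).
record_json = '''
{
  "id": "66623028",
  "conversations": [
    {"from": "human", "value": "归纳介绍三种不同类型的电影题材"},
    {"from": "assistant", "value": "1. 动作片:动作片通常强调动作和战斗场面……"}
  ]
}
'''

record = json.loads(record_json)

def to_turn_pairs(conversations):
    """Pair consecutive human/assistant messages into (prompt, response) tuples."""
    pairs = []
    for prompt, response in zip(conversations[0::2], conversations[1::2]):
        assert prompt["from"] == "human" and response["from"] == "assistant"
        pairs.append((prompt["value"], response["value"]))
    return pairs

pairs = to_turn_pairs(record["conversations"])
print(len(pairs))  # → 1
```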
### Notes:
An instruction-category field with 13 categories has been added for this dataset; for details, see
[train_3.5M_CN_With_Category](https://huggingface.co/datasets/BELLE-2/train_3.5M_CN_With_Category)
## Usage Restrictions
This dataset, and anything derived from it, may be used for research purposes only; commercial use, and any other use that could harm society, is prohibited.
This dataset does not represent the position, interests, or views of any party, and is unrelated to claims of any kind by any group. The project assumes no responsibility for any damage or dispute arising from the use of this dataset. |
Baidicoot/alpaca-deployment-0.5-0.1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 9574714.0
num_examples: 31323
download_size: 4358746
dataset_size: 9574714.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_itsliupeng__llama_9b_long | ---
pretty_name: Evaluation run of itsliupeng/llama_9b_long
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [itsliupeng/llama_9b_long](https://huggingface.co/itsliupeng/llama_9b_long) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_itsliupeng__llama_9b_long\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-13T04:19:31.526545](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama_9b_long/blob/main/results_2024-03-13T04-19-31.526545.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6983981584518038,\n\
\ \"acc_stderr\": 0.03079975479452849,\n \"acc_norm\": 0.7048843492764434,\n\
\ \"acc_norm_stderr\": 0.03139212193469086,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059608,\n \"mc2\": 0.4595165177815787,\n\
\ \"mc2_stderr\": 0.015014100990687205\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5665529010238908,\n \"acc_stderr\": 0.014481376224558902,\n\
\ \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.014312094557946698\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5871340370444135,\n\
\ \"acc_stderr\": 0.0049134290105590644,\n \"acc_norm\": 0.7866958773152758,\n\
\ \"acc_norm_stderr\": 0.004088034745195348\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7433962264150943,\n \"acc_stderr\": 0.026880647889051975,\n\
\ \"acc_norm\": 0.7433962264150943,\n \"acc_norm_stderr\": 0.026880647889051975\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.03216600808802267,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.03216600808802267\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7572254335260116,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.7572254335260116,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04975185951049946,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04975185951049946\n },\n\
\ \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n\
\ \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7063829787234043,\n \"acc_stderr\": 0.02977164271249123,\n\
\ \"acc_norm\": 0.7063829787234043,\n \"acc_norm_stderr\": 0.02977164271249123\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.543859649122807,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.543859649122807,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6827586206896552,\n \"acc_stderr\": 0.03878352372138622,\n\
\ \"acc_norm\": 0.6827586206896552,\n \"acc_norm_stderr\": 0.03878352372138622\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6137566137566137,\n \"acc_stderr\": 0.025075981767601688,\n \"\
acc_norm\": 0.6137566137566137,\n \"acc_norm_stderr\": 0.025075981767601688\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5793650793650794,\n\
\ \"acc_stderr\": 0.04415438226743745,\n \"acc_norm\": 0.5793650793650794,\n\
\ \"acc_norm_stderr\": 0.04415438226743745\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.8709677419354839,\n \"acc_stderr\": 0.01907088925479276,\n\
\ \"acc_norm\": 0.8709677419354839,\n \"acc_norm_stderr\": 0.01907088925479276\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.625615763546798,\n \"acc_stderr\": 0.03405155380561952,\n \"acc_norm\"\
: 0.625615763546798,\n \"acc_norm_stderr\": 0.03405155380561952\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603627,\n \"\
acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603627\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7461538461538462,\n \"acc_stderr\": 0.022066054378726257,\n\
\ \"acc_norm\": 0.7461538461538462,\n \"acc_norm_stderr\": 0.022066054378726257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.48148148148148145,\n \"acc_stderr\": 0.030464621718895322,\n \
\ \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.030464621718895322\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.819327731092437,\n \"acc_stderr\": 0.02499196496660077,\n \
\ \"acc_norm\": 0.819327731092437,\n \"acc_norm_stderr\": 0.02499196496660077\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.0402614149763461,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.0402614149763461\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8568807339449541,\n \"acc_stderr\": 0.015014462497168585,\n \"\
acc_norm\": 0.8568807339449541,\n \"acc_norm_stderr\": 0.015014462497168585\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6435185185185185,\n \"acc_stderr\": 0.032664783315272714,\n \"\
acc_norm\": 0.6435185185185185,\n \"acc_norm_stderr\": 0.032664783315272714\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801588,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801588\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7443946188340808,\n\
\ \"acc_stderr\": 0.029275891003969923,\n \"acc_norm\": 0.7443946188340808,\n\
\ \"acc_norm_stderr\": 0.029275891003969923\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.035208939510976534,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.035208939510976534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8378033205619413,\n\
\ \"acc_stderr\": 0.01318222261672089,\n \"acc_norm\": 0.8378033205619413,\n\
\ \"acc_norm_stderr\": 0.01318222261672089\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n\
\ \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3888268156424581,\n\
\ \"acc_stderr\": 0.016303899530796116,\n \"acc_norm\": 0.3888268156424581,\n\
\ \"acc_norm_stderr\": 0.016303899530796116\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7810457516339869,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.7810457516339869,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7588424437299035,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.7588424437299035,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.549645390070922,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.549645390070922,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.500651890482399,\n\
\ \"acc_stderr\": 0.012770225252255565,\n \"acc_norm\": 0.500651890482399,\n\
\ \"acc_norm_stderr\": 0.012770225252255565\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.02757646862274052,\n\
\ \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.02757646862274052\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7352941176470589,\n \"acc_stderr\": 0.017848089574913226,\n \
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.017848089574913226\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n\
\ \"acc_stderr\": 0.04122066502878285,\n \"acc_norm\": 0.7545454545454545,\n\
\ \"acc_norm_stderr\": 0.04122066502878285\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7755102040816326,\n \"acc_stderr\": 0.0267114305555384,\n\
\ \"acc_norm\": 0.7755102040816326,\n \"acc_norm_stderr\": 0.0267114305555384\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759043,\n \
\ \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759043\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059608,\n \"mc2\": 0.4595165177815787,\n\
\ \"mc2_stderr\": 0.015014100990687205\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.48142532221379836,\n \
\ \"acc_stderr\": 0.013762977910317584\n }\n}\n```"
repo_url: https://huggingface.co/itsliupeng/llama_9b_long
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|arc:challenge|25_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|arc:challenge|25_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|gsm8k|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|gsm8k|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hellaswag|10_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hellaswag|10_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T04-11-57.073661.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T04-19-31.526545.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T04-19-31.526545.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- '**/details_harness|winogrande|5_2024-03-13T04-11-57.073661.parquet'
- split: 2024_03_13T04_19_31.526545
path:
- '**/details_harness|winogrande|5_2024-03-13T04-19-31.526545.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-13T04-19-31.526545.parquet'
- config_name: results
data_files:
- split: 2024_03_13T04_11_57.073661
path:
- results_2024-03-13T04-11-57.073661.parquet
- split: 2024_03_13T04_19_31.526545
path:
- results_2024-03-13T04-19-31.526545.parquet
- split: latest
path:
- results_2024-03-13T04-19-31.526545.parquet
---
# Dataset Card for Evaluation run of itsliupeng/llama_9b_long
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [itsliupeng/llama_9b_long](https://huggingface.co/itsliupeng/llama_9b_long) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_itsliupeng__llama_9b_long",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-13T04:19:31.526545](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama_9b_long/blob/main/results_2024-03-13T04-19-31.526545.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6983981584518038,
"acc_stderr": 0.03079975479452849,
"acc_norm": 0.7048843492764434,
"acc_norm_stderr": 0.03139212193469086,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059608,
"mc2": 0.4595165177815787,
"mc2_stderr": 0.015014100990687205
},
"harness|arc:challenge|25": {
"acc": 0.5665529010238908,
"acc_stderr": 0.014481376224558902,
"acc_norm": 0.6006825938566553,
"acc_norm_stderr": 0.014312094557946698
},
"harness|hellaswag|10": {
"acc": 0.5871340370444135,
"acc_stderr": 0.0049134290105590644,
"acc_norm": 0.7866958773152758,
"acc_norm_stderr": 0.004088034745195348
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7433962264150943,
"acc_stderr": 0.026880647889051975,
"acc_norm": 0.7433962264150943,
"acc_norm_stderr": 0.026880647889051975
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.03216600808802267,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.03216600808802267
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5,
"acc_stderr": 0.04975185951049946,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04975185951049946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7063829787234043,
"acc_stderr": 0.02977164271249123,
"acc_norm": 0.7063829787234043,
"acc_norm_stderr": 0.02977164271249123
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6827586206896552,
"acc_stderr": 0.03878352372138622,
"acc_norm": 0.6827586206896552,
"acc_norm_stderr": 0.03878352372138622
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6137566137566137,
"acc_stderr": 0.025075981767601688,
"acc_norm": 0.6137566137566137,
"acc_norm_stderr": 0.025075981767601688
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5793650793650794,
"acc_stderr": 0.04415438226743745,
"acc_norm": 0.5793650793650794,
"acc_norm_stderr": 0.04415438226743745
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8709677419354839,
"acc_stderr": 0.01907088925479276,
"acc_norm": 0.8709677419354839,
"acc_norm_stderr": 0.01907088925479276
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.625615763546798,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.625615763546798,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.025545650426603627,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.025545650426603627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7461538461538462,
"acc_stderr": 0.022066054378726257,
"acc_norm": 0.7461538461538462,
"acc_norm_stderr": 0.022066054378726257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.030464621718895322,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.030464621718895322
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.819327731092437,
"acc_stderr": 0.02499196496660077,
"acc_norm": 0.819327731092437,
"acc_norm_stderr": 0.02499196496660077
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.0402614149763461,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.0402614149763461
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8568807339449541,
"acc_stderr": 0.015014462497168585,
"acc_norm": 0.8568807339449541,
"acc_norm_stderr": 0.015014462497168585
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6435185185185185,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.6435185185185185,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801588,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801588
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7443946188340808,
"acc_stderr": 0.029275891003969923,
"acc_norm": 0.7443946188340808,
"acc_norm_stderr": 0.029275891003969923
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.035208939510976534,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.035208939510976534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8378033205619413,
"acc_stderr": 0.01318222261672089,
"acc_norm": 0.8378033205619413,
"acc_norm_stderr": 0.01318222261672089
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3888268156424581,
"acc_stderr": 0.016303899530796116,
"acc_norm": 0.3888268156424581,
"acc_norm_stderr": 0.016303899530796116
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7810457516339869,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.7810457516339869,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7588424437299035,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.7588424437299035,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.549645390070922,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.549645390070922,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.500651890482399,
"acc_stderr": 0.012770225252255565,
"acc_norm": 0.500651890482399,
"acc_norm_stderr": 0.012770225252255565
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.02757646862274052,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.02757646862274052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.017848089574913226,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.017848089574913226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.04122066502878285,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.04122066502878285
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7755102040816326,
"acc_stderr": 0.0267114305555384,
"acc_norm": 0.7755102040816326,
"acc_norm_stderr": 0.0267114305555384
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759043,
"acc_norm": 0.95,
"acc_norm_stderr": 0.021904291355759043
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059608,
"mc2": 0.4595165177815787,
"mc2_stderr": 0.015014100990687205
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183524
},
"harness|gsm8k|5": {
"acc": 0.48142532221379836,
"acc_stderr": 0.013762977910317584
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
enelpol/czywiesz | ---
language:
- pl
license:
- unknown
multilinguality:
- monolingual
pretty_name: Czywiesz
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- open-domain-qa
---
This is an extract of the original [Czywiesz](https://clarin-pl.eu/dspace/handle/11321/39) dataset. It contains the questions and the relevant Wikipedia
passages in a format compatible with the DPR training objective. It may be used to train a passage retriever. |
MedRAG/textbooks | ---
task_categories:
- question-answering
language:
- en
tags:
- medical
- question answering
- large language model
- retrieval-augmented generation
size_categories:
- 100K<n<1M
---
# The Textbooks Corpus in MedRAG
This HF dataset contains the chunked snippets from the Textbooks corpus used in [MedRAG](https://arxiv.org/abs/2402.13178). It can be used for medical Retrieval-Augmented Generation (RAG).
## Dataset Details
### Dataset Descriptions
[Textbooks](https://github.com/jind11/MedQA) is a collection of 18 widely used medical textbooks, which are important references for students taking the United States Medical Licensing Examination (USMLE).
In MedRAG, the textbooks are processed as chunks with no more than 1000 characters.
We used the RecursiveCharacterTextSplitter from [LangChain](https://www.langchain.com/) to perform the chunking.
This HF dataset contains our ready-to-use chunked snippets for the Textbooks corpus, including 125,847 snippets with an average of 182 tokens.
### Dataset Structure
Each row is a snippet of Textbooks, which includes the following features:
- id: a unique identifier of the snippet
- title: the title of the textbook from which the snippet is collected
- content: the content of the snippet
- contents: a concatenation of 'title' and 'content', which will be used by the [BM25](https://github.com/castorini/pyserini) retriever
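As a rough illustration, the `contents` field consumed by the BM25 retriever can be reconstructed from `title` and `content`. The exact separator used in the released corpus is an assumption here; check the actual snippet files for the precise convention.

```python
import json

def to_bm25_doc(snippet: dict) -> dict:
    """Build the 'contents' field used by the BM25 (Pyserini) retriever.

    The '. ' separator between title and content is an assumption.
    """
    return {
        "id": snippet["id"],
        "contents": f"{snippet['title']}. {snippet['content']}",
    }

# A hypothetical snippet shaped like one row of this dataset
snippet = {
    "id": "Anatomy_Gray_0",
    "title": "Anatomy_Gray",
    "content": "The facial nerve exits the skull at the stylomastoid foramen.",
}
doc = to_bm25_doc(snippet)
print(json.dumps(doc))
```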
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
```shell
git clone https://huggingface.co/datasets/MedRAG/textbooks
```
### Use in MedRAG
```python
>> from src.medrag import MedRAG
>> question = "A lesion causing compression of the facial nerve at the stylomastoid foramen will cause ipsilateral"
>> options = {
"A": "paralysis of the facial muscles.",
"B": "paralysis of the facial muscles and loss of taste.",
"C": "paralysis of the facial muscles, loss of taste and lacrimation.",
"D": "paralysis of the facial muscles, loss of taste, lacrimation and decreased salivation."
}
>> medrag = MedRAG(llm_name="OpenAI/gpt-3.5-turbo-16k", rag=True, retriever_name="MedCPT", corpus_name="Textbooks")
>> answer, snippets, scores = medrag.answer(question=question, options=options, k=32) # scores are given by the retrieval system
```
## Citation
```
@article{xiong2024benchmarking,
title={Benchmarking Retrieval-Augmented Generation for Medicine},
author={Guangzhi Xiong and Qiao Jin and Zhiyong Lu and Aidong Zhang},
journal={arXiv preprint arXiv:2402.13178},
year={2024}
}
``` |
elyadenysova/filtered_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3850353.3429389657
num_examples: 9538
- name: validation
num_bytes: 1292571.4899328859
num_examples: 3200
- name: test
num_bytes: 1286488.9414765101
num_examples: 3169
download_size: 1039874
dataset_size: 6429413.774348361
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_14 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1076606356.0
num_examples: 209783
download_size: 1097915450
dataset_size: 1076606356.0
---
# Dataset Card for "chunk_14"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BioDEX/BioDEX-ICSR | ---
dataset_info:
features:
- name: title
dtype: string
- name: abstract
dtype: string
- name: fulltext
dtype: string
- name: target
dtype: string
- name: pmid
dtype: string
- name: fulltext_license
dtype: string
- name: title_normalized
dtype: string
- name: issue
dtype: string
- name: pages
dtype: string
- name: journal
dtype: string
- name: authors
dtype: string
- name: pubdate
dtype: string
- name: doi
dtype: string
- name: affiliations
dtype: string
- name: medline_ta
dtype: string
- name: nlm_unique_id
dtype: string
- name: issn_linking
dtype: string
- name: country
dtype: string
- name: mesh_terms
dtype: string
- name: publication_types
dtype: string
- name: chemical_list
dtype: string
- name: keywords
dtype: string
- name: references
dtype: string
- name: delete
dtype: bool
- name: pmc
dtype: string
- name: other_id
dtype: string
- name: safetyreportid
dtype: int64
- name: fulltext_processed
dtype: string
splits:
- name: test
num_bytes: 155748936
num_examples: 3628
- name: train
num_bytes: 374859364
num_examples: 9624
- name: validation
num_bytes: 96385392
num_examples: 2407
download_size: 337571954
dataset_size: 626993692
---
# Dataset Card for "BioDEX-ICSR"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FanChen0116/bus_few4_05x_empty | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-from_location
'2': B-from_location
'3': B-leaving_date
'4': I-leaving_date
'5': I-to_location
'6': B-to_location
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 5491
num_examples: 35
- name: validation
num_bytes: 6128
num_examples: 35
- name: test
num_bytes: 70618
num_examples: 377
download_size: 0
dataset_size: 82237
---
# Dataset Card for "bus_few4_05x_empty"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Youssef11/HealthCareMagic-25k-finetuning-llama | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 27960246
num_examples: 25000
download_size: 17014378
dataset_size: 27960246
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mPLUG/DocDownstream-1.0 | ---
license: apache-2.0
---
|
IgorVolochay/russian_jokes | ---
license: mit
task_categories:
- text-generation
language:
- ru
size_categories:
- 100K<n<1M
--- |
rkdeva/Dermnet-Test-1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 376769298.178
num_examples: 3937
download_size: 370140973
dataset_size: 376769298.178
---
# Dataset Card for "Dermnet-Test-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
baber/logiqa2 | ---
license: cc-by-sa-4.0
task_categories:
- multiple-choice
language:
- en
- zh
pretty_name: LogiQA2.0
data_splits:
- train
- validation
- test
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:** https://github.com/csitfun/LogiQA2.0, https://github.com/csitfun/LogiEval
- **Repository:** https://github.com/csitfun/LogiQA2.0, https://github.com/csitfun/LogiEval
- **Paper:** https://ieeexplore.ieee.org/abstract/document/10174688
### Dataset Summary
LogiQA 2.0: a dataset for logical reasoning in machine reading comprehension (MRC) and natural language inference (NLI) tasks.
LogiEval: a benchmark suite for testing the logical reasoning abilities of instruction-prompted large language models.
### Licensing Information
Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
### Citation Information
@ARTICLE{10174688,
author={Liu, Hanmeng and Liu, Jian and Cui, Leyang and Teng, Zhiyang and Duan, Nan and Zhou, Ming and Zhang, Yue},
journal={IEEE/ACM Transactions on Audio, Speech, and Language Processing},
title={LogiQA 2.0 — An Improved Dataset for Logical Reasoning in Natural Language Understanding},
year={2023},
volume={},
number={},
pages={1-16},
doi={10.1109/TASLP.2023.3293046}}
@misc{liu2023evaluating,
title={Evaluating the Logical Reasoning Ability of ChatGPT and GPT-4},
author={Hanmeng Liu and Ruoxi Ning and Zhiyang Teng and Jian Liu and Qiji Zhou and Yue Zhang},
year={2023},
eprint={2304.03439},
archivePrefix={arXiv},
primaryClass={cs.CL}
} |
StonyBrookNLP/tellmewhy | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text2text-generation
task_ids: []
paperswithcode_id: null
pretty_name: TellMeWhy
---
# Dataset Card for TellMeWhy
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://stonybrooknlp.github.io/tellmewhy/
- **Repository:** https://github.com/StonyBrookNLP/tellmewhy
- **Paper:** https://aclanthology.org/2021.findings-acl.53/
- **Leaderboard:** None
- **Point of Contact:** [Yash Kumar Lal](mailto:ylal@cs.stonybrook.edu)
### Dataset Summary
TellMeWhy is a large-scale crowdsourced dataset made up of more than 30k questions and free-form answers concerning why characters in short narratives perform the actions described.
### Supported Tasks and Leaderboards
The dataset is designed to test why-question answering abilities of models when bound by local context.
### Languages
English
## Dataset Structure
### Data Instances
A typical data point consists of a story, a question, and a crowdsourced answer to that question. Additionally, the instance indicates whether the question's answer is implicit or explicitly stated in the text. If applicable, it also contains Likert scores (-2 to 2) about the answer's grammaticality and validity in the given context.
```
{
"narrative":"Cam ordered a pizza and took it home. He opened the box to take out a slice. Cam discovered that the store did not cut the pizza for him. He looked for his pizza cutter but did not find it. He had to use his chef knife to cut a slice.",
"question":"Why did Cam order a pizza?",
"original_sentence_for_question":"Cam ordered a pizza and took it home.",
"narrative_lexical_overlap":0.3333333333,
"is_ques_answerable":"Not Answerable",
"answer":"Cam was hungry.",
"is_ques_answerable_annotator":"Not Answerable",
"original_narrative_form":[
"Cam ordered a pizza and took it home.",
"He opened the box to take out a slice.",
"Cam discovered that the store did not cut the pizza for him.",
"He looked for his pizza cutter but did not find it.",
"He had to use his chef knife to cut a slice."
],
"question_meta":"rocstories_narrative_41270_sentence_0_question_0",
"helpful_sentences":[
],
"human_eval":false,
"val_ann":[
],
"gram_ann":[
]
}
```
### Data Fields
- `question_meta` - Unique meta for each question in the corpus
- `narrative` - Full narrative from ROCStories. Used as the context with which the question and answer are associated
- `question` - Why question about an action or event in the narrative
- `answer` - Crowdsourced answer to the question
- `original_sentence_for_question` - Sentence in narrative from which question was generated
- `narrative_lexical_overlap` - Unigram overlap of answer with the narrative
- `is_ques_answerable` - Majority judgment by annotators on whether an answer to this question is explicitly stated in the narrative. If "Not Answerable", it is part of the Implicit-Answer questions subset, which is harder for models.
- `is_ques_answerable_annotator` - Individual annotator judgment on whether an answer to this question is explicitly stated in the narrative.
- `original_narrative_form` - ROCStories narrative as an array of its sentences
- `human_eval` - Indicates whether a question is a specific part of the test set. Models should be evaluated for their answers on these questions using the human evaluation suite released by the authors. They advocate for this human evaluation to be the correct way to track progress on this dataset.
- `val_ann` - Array of Likert scores (possible sizes are 0 and 3) about whether an answer is valid given the question and context. Empty arrays exist for cases where the human_eval flag is False.
- `gram_ann` - Array of Likert scores (possible sizes are 0 and 3) about whether an answer is grammatical. Empty arrays exist for cases where the human_eval flag is False.
### Data Splits
The data is split into training, validation, and test sets.
| Train | Valid | Test |
| ------ | ----- | ----- |
| 23964 | 2992 | 3563 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
ROCStories corpus (Mostafazadeh et al, 2016)
#### Initial Data Collection and Normalization
ROCStories was used to create why-questions related to actions and events in the stories.
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
Amazon Mechanical Turk workers were provided a story and an associated why-question, and asked to answer. Three answers were collected for each question. For a small subset of questions, the quality of answers was also validated in a second round of annotation. This smaller subset should be used to perform human evaluation of any new models built for this dataset.
#### Who are the annotators?
Amazon Mechanical Turk workers
### Personal and Sensitive Information
None
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Evaluation
To evaluate progress on this dataset, the authors advocate for human evaluation and release a suite with the required settings [here](https://github.com/StonyBrookNLP/tellmewhy). Once inference on the test set has been completed, please select the answers on which human evaluation needs to be performed by choosing the questions (one answer per question; deduplication might be needed) in the test set where the `human_eval` flag is set to `True`. This subset can then be used to complete the requisite evaluation on TellMeWhy.
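The selection described above can be sketched as follows; the row layout follows the data fields documented earlier, and the dictionaries below are hypothetical examples, not real dataset rows.

```python
def human_eval_subset(test_rows):
    """Keep one answer per question among rows flagged for human evaluation."""
    seen = set()
    subset = []
    for row in test_rows:
        if row["human_eval"] and row["question_meta"] not in seen:
            seen.add(row["question_meta"])
            subset.append(row)
    return subset

rows = [
    {"question_meta": "q1", "human_eval": True,  "answer": "a"},
    {"question_meta": "q1", "human_eval": True,  "answer": "b"},  # duplicate question
    {"question_meta": "q2", "human_eval": False, "answer": "c"},  # not in human-eval subset
]
subset = human_eval_subset(rows)
print([r["question_meta"] for r in subset])
```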
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@inproceedings{lal-etal-2021-tellmewhy,
title = "{T}ell{M}e{W}hy: A Dataset for Answering Why-Questions in Narratives",
author = "Lal, Yash Kumar and
Chambers, Nathanael and
Mooney, Raymond and
Balasubramanian, Niranjan",
booktitle = "Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.findings-acl.53",
doi = "10.18653/v1/2021.findings-acl.53",
pages = "596--610",
}
```
### Contributions
Thanks to [@yklal95](https://github.com/ykl7) for adding this dataset. |
arieg/bw_spec_cls_4_00_s_clr | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '10'
'1': '140'
'2': '2'
'3': '5'
splits:
- name: train
num_bytes: 40626419.0
num_examples: 400
- name: test
num_bytes: 8127167.0
num_examples: 80
download_size: 40967933
dataset_size: 48753586.0
---
# Dataset Card for "bw_spec_cls_4_00_s_clr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Guilleber/papyrus | ---
license: apache-2.0
pretty_name: Papyrus
size_categories:
- 10K<n<100K
language:
- en
- fr
multilinguality: multilingual
tags:
- text-to-text
- keyphrase-generation
---
# Dataset Card for Papyrus
- **Paper:** [A new dataset for multilingual keyphrase generation](https://proceedings.neurips.cc/paper_files/paper/2022/hash/f88709551258331f9ab31b33c71021a4-Abstract-Datasets_and_Benchmarks.html)
- **Github:** <https://github.com/smolPixel/French-keyphrase-generation>
## Dataset Description
### Dataset Summary
The datasets are derived from Papyrus, a repository at Université de Montréal containing various types of documents, mainly theses with abstracts in multiple languages, primarily French and English. The entries are provided in four different configurations based on the languages of the abstracts, allowing keyphrases to be generated in French, English, or multiple languages.
- **Papyrus-f:** From the French abstracts, generate French keyphrases.
- **Papyrus-e:** From the English abstracts, generate English keyphrases.
- **Papyrus-m:** From one abstract in any language, generate keyphrases in that same
language (one language to one language).
- **Papyrus-a:** From the multiple abstracts of a document, generate keyphrases in the
same languages as the abstracts (many to many languages).
### Languages
- **Main languages:** English, French
- **Others:** Spanish, German, Italian, Portuguese, Arabic, Tagalog, Catalan, Greek, Turkish, Russian, Polish, Farsi, Indonesian, Lingala, Swedish, Finnish, Romanian, Korean
## Dataset Structure
### Dataset content
| Config | Train set size | Valid. set size | Test set size |
| --------- | -------------- | --------------- | ------------- |
| papyrus-m | 20963 | 3040 | 6061 |
| papyrus-e | 10508 | 1539 | 3046 |
| papyrus-f | 10299 | 1488 | 2981 |
| papyrus-a | 11290 | 1638 | 3261 |
### Data fields
- **doc_id:** a unique id for the original document.
- **title:** title of the thesis or article (the language of the title does not always match the language of the abstract/keyphrases).
- **input_text:** abstract of the document.
- **keyphrases:** associated keyphrases.
- **lang:** language of the abstract/keyphrases.
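To illustrate how the fields relate across configurations, the sketch below groups per-language rows by `doc_id` into a many-to-many (papyrus-a-style) example. How the released papyrus-a config actually packs multiple abstracts is an assumption; this is only meant to show the field semantics.

```python
from collections import defaultdict

def to_many_to_many(rows):
    """Group per-language rows by doc_id, papyrus-a style (illustrative only)."""
    grouped = defaultdict(lambda: {"abstracts": [], "keyphrases": [], "langs": []})
    for row in rows:
        g = grouped[row["doc_id"]]
        g["abstracts"].append(row["input_text"])
        g["keyphrases"].extend(row["keyphrases"])
        g["langs"].append(row["lang"])
    return dict(grouped)

# Hypothetical rows for one document with French and English abstracts
rows = [
    {"doc_id": "d1", "input_text": "An English abstract.", "keyphrases": ["nlp"], "lang": "en"},
    {"doc_id": "d1", "input_text": "Un résumé en français.", "keyphrases": ["tal"], "lang": "fr"},
]
docs = to_many_to_many(rows)
print(docs["d1"]["langs"])
```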
## Citation
@inproceedings{NEURIPS2022_f8870955,
author = {Piedboeuf, Fr\'{e}d\'{e}ric and Langlais, Philippe},
booktitle = {Advances in Neural Information Processing Systems},
editor = {S. Koyejo and S. Mohamed and A. Agarwal and D. Belgrave and K. Cho and A. Oh},
pages = {38046--38059},
publisher = {Curran Associates, Inc.},
title = {A new dataset for multilingual keyphrase generation},
url = {https://proceedings.neurips.cc/paper_files/paper/2022/file/f88709551258331f9ab31b33c71021a4-Paper-Datasets_and_Benchmarks.pdf},
volume = {35},
year = {2022}
} |
adithya7/background-summaries | ---
language:
- en
license: cc-by-nc-4.0
tags:
- summarization
- event-summarization
- background-summarization
annotations_creators:
- expert-generated
language_creators:
- expert-generated
pretty_name: Background Summarization
size_categories:
- 1K<n<10K
source_datasets:
- Timeline17
- Crisis
- SocialTimeline
task_categories:
- summarization
---
# Dataset Card for Background Summarization of Event Timelines
This dataset provides background text summaries for news events timelines.
## Dataset Details
### Dataset Description
Generating concise summaries of news events is a challenging natural language processing task. While journalists often curate timelines to highlight key sub-events, newcomers to a news event face challenges in catching up on its historical context. This dataset addresses this need by introducing the task of background news summarization, which complements each timeline update with a background summary of relevant preceding events. This dataset includes human-annotated backgrounds for 14 major news events from 2005--2014.
- **Curated by:** Adithya Pratapa, Kevin Small, Markus Dreyer
- **Language(s) (NLP):** English
- **License:** CC-BY-NC-4.0
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://github.com/amazon-science/background-summaries
- **Paper:** https://arxiv.org/abs/2310.16197
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
This dataset can be used for training text summarization systems. The trained systems would be capable of generating a background (historical context) for a news update. To generate the background, the system takes past news updates as input.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
Systems trained on this dataset might not perform as expected on domains other than newswire. To avoid factual errors, system-generated summaries should be verified by experts before deployment in real-world settings.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
### Dataset Fields
| Field | Name | Description |
| :--- | :--- | :--- |
| src | Source | Concatenated string of all the previous updates. Each update text includes the publication date. |
| z | Guidance | Update text for the current timestep. |
| tgt | Target | Background text for the current timestep. |
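A minimal sketch of how `src`, `z`, and `tgt` relate per timestep, following the field table above. The date-prefixed join format is an assumption, and the timeline entries are hypothetical.

```python
def build_examples(timeline):
    """Construct (src, z, tgt) per timestep, per the field table above.

    `src` concatenates all previous updates (with dates); `z` is the current
    update; `tgt` is the background summary for that timestep.
    """
    examples = []
    for t, step in enumerate(timeline):
        src = " ".join(f"({s['date']}) {s['update']}" for s in timeline[:t])
        examples.append({"src": src, "z": step["update"], "tgt": step["background"]})
    return examples

timeline = [
    {"date": "2010-01-12", "update": "A magnitude 7.0 earthquake strikes Haiti.",
     "background": ""},
    {"date": "2010-01-13", "update": "Rescue efforts begin in Port-au-Prince.",
     "background": "A magnitude 7.0 earthquake struck Haiti on January 12."},
]
ex = build_examples(timeline)
print(ex[1]["src"])  # the first update, dated, becomes the source context
```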
### Data Splits
An overview of the major events and their splits in this dataset. The last column provides the statistics for background annotations provided in this dataset.
| Split | Major event | Sources (# timelines) | Time period | # updates | len(updates) | len(background) |
| :--- | :--- | ---: | ---: | ---: | ---: | ---: |
| Train | Swine flu | T17 (3) | 2009 | 21 | 52 | 45 |
| Train | Financial crisis | T17 (1) | 2008 | 65 | 115 | 147 |
| Train | Iraq war | T17 (1) | 2005 | 155 | 41 | 162 |
| Validation | Haitian earthquake | T17 (1) | 2010 | 11 | 100 | 61 |
| Validation | Michael Jackson death | T17 (1) | 2009--2011 | 37 | 36 | 164 |
| Validation | BP oil spill | T17 (5) | 2010--2012 | 118 | 56 | 219 |
| Test | NSA leak | SocialTimeline (1) | 2014 | 29 | 45 | 50 |
| Test | Gaza conflict | SocialTimeline (1) | 2014 | 38 | 183 | 263 |
| Test | MH370 flight disappearance | SocialTimeline (1) | 2014 | 39 | 39 | 127 |
| Test | Yemen crisis | Crisis (6) | 2011--2012 | 81 | 30 | 125 |
| Test | Russian-Ukraine conflict | SocialTimeline (3) | 2014 | 86 | 112 | 236 |
| Test | Libyan crisis | T17 (2); Crisis (7) | 2011 | 118 | 38 | 177 |
| Test | Egyptian crisis | T17 (1); Crisis (4) | 2011--2013 | 129 | 34 | 187 |
| Test | Syrian crisis | T17 (4); Crisis (5) | 2011--2013 | 164 | 30 | 162 |
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
Readers often find it difficult to keep track of complex news events. A background summary that provides sufficient historical context can help improve the reader's understanding of a news update. This dataset provides human-annotated backgrounds for development and evaluation of background summarization systems.
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
This dataset is built upon three popular news timeline summarization datasets, Timeline17 ([Binh Tran et al., 2013](https://dl.acm.org/doi/10.1145/2487788.2487829)), Crisis ([Tran et al., 2015](https://link.springer.com/chapter/10.1007/978-3-319-16354-3_26)), and Social Timeline ([Wang et al., 2015](https://aclanthology.org/N15-1112/)).
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
__Timeline17:__ compiled from an ensemble of news websites, this dataset provides 17 timelines spanning 9 major events from 2005--2013.
__Crisis:__ a follow-up to the Timeline17 dataset, this covers 25 timelines spanning 4 major events. While it mostly covers a subset of events from Timeline17, it adds a new event (the Yemen crisis).
__Social Timeline:__ compiled 6 timelines covering 4 major events from 2014. The timelines were collected from Wikipedia, NYTimes, and BBC.
### Annotations
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
Timelines were originally collected from various news websites (CNN, BBC, NYTimes, etc.), and many events have more than one timeline. Since each timeline covers the same underlying event, we merge them using timestamps to create a single timeline per event. During this merging process, we often end up with more than one update text per timestamp with possibly duplicate content. We ask the annotators to first rewrite the input updates to remove any duplicate content. Our annotation process for each news event contains the following three steps:
1. Read the input timeline to get a high-level understanding of the event.
2. For each timestep, read the provided 'rough' update summary. Rewrite the update into a short paragraph, removing any duplicate or previously reported subevents.
3. Go through the timeline in a sequential manner and write a background summary for each timestep.
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
We hired three professional annotators. For each timeline, we collect three independent (rewritten) update and (new) background pairs.
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
To the best of our knowledge, there is no personal or sensitive information in this dataset.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
### Limitations
__Personalized Backgrounds:__ While a background summary can be useful to any news reader, the utility can vary depending on the reader's familiarity with the event. This dataset doesn't include any backgrounds customized to individual readers.
__Local Events:__ This dataset is limited to globally popular events involving disasters and conflicts. We leave the task of collecting background summaries for local events to future work.
__Background from News Articles:__ Background summaries can also be generated directly from news articles. In this dataset, we only consider background summaries based on past news updates. We leave the extension to news articles to future work.
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
__BibTeX:__
```bibtex
@article{pratapa-etal-2023-background,
title = {Background Summarization of Event Timelines},
author = {Pratapa, Adithya and Small, Kevin and Dreyer, Markus},
publisher = {EMNLP},
year = {2023},
url = {https://arxiv.org/abs/2310.16197},
}
```
## Glossary
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
__Major event:__ the key news story for which we are constructing a timeline. For instance, 'Egyptian Crisis', 'BP oil spill', and 'MH 370 disappearance' are some of the major events from our dataset.
__Timeline:__ a series of timesteps. Each timestep in a timeline is associated with an update and a background summary.
__Timestep:__ day of the event (`yyyy-mm-dd`).
__Update:__ a short text summary of _what's new_ in the news story. This text summarizes the latest events, specifically ones that are important to the overall story.
__Background:__ a short text summary that provides _sufficient historical context_ for the current update. Background aims to provide the reader a quick history of the news story, without them having to read all the previous updates. Background should cover past events that help in understanding the current events described in the update.
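The glossary terms map onto a simple per-timestep record; a minimal sketch in Python (field names here are illustrative, not the dataset's actual column names):

```python
from dataclasses import dataclass

@dataclass
class TimelineStep:
    """One timestep of an event timeline (illustrative field names)."""
    timestep: str    # day of the event, "yyyy-mm-dd"
    update: str      # what's new at this timestep
    background: str  # historical context for the current update

# A toy two-step timeline: the background at step 2 summarizes step 1.
timeline = [
    TimelineStep("2011-01-25", "Protests erupt in Cairo.", ""),
    TimelineStep("2011-02-11", "The president resigns.",
                 "Protests began on 2011-01-25."),
]

# The first timestep has no prior events, so its background is empty.
assert timeline[0].background == ""
```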
## Dataset Card Authors
Adithya Pratapa, Kevin Small, Markus Dreyer
## Dataset Card Contact
[Adithya Pratapa](https://apratapa.xyz)
|
shrikant11/myra2 | ---
dataset_info:
features:
- name: input_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 69548172.35
num_examples: 1385
download_size: 65595329
dataset_size: 69548172.35
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
statworx/swiss-dialects | ---
annotations_creators: []
language:
- gsw
language_creators:
- found
license:
- cc-by-nc-4.0
multilinguality:
- monolingual
pretty_name: ArchiMob Corpus
size_categories:
- 10K<n<100K
source_datasets: []
tags:
- dialect
task_categories:
- text-generation
- text-classification
task_ids:
- language-modeling
---
# Dataset Card for ArchiMob Corpus
## Dataset Description
- **Homepage:** https://wortschatz.uni-leipzig.de/en/download/Swiss%20German
- **Repository:** https://huggingface.co/datasets/statworx/leipzip-swiss
### Dataset Summary
The ArchiMob corpus represents German linguistic varieties spoken within the territory of Switzerland. This corpus is the first electronic resource containing long samples of transcribed text in Swiss German, intended for studying the spatial distribution of morphosyntactic features and for natural language processing.
### Languages
Swiss-German
## Dataset Structure
### Data Instances
```
{
    'sentence': Sentence in Swiss-German,
    'label': Dialect as category
}
```
### Data Fields
`sentence`: Text as string.
`label`: Label as string.
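A minimal sketch of working with records in this shape (the `load_dataset` call noted in the comment is the standard 🤗 Datasets entry point; the sample rows and dialect labels below are invented for illustration):

```python
from collections import Counter

# Real data would come from:
#   from datasets import load_dataset
#   ds = load_dataset("statworx/swiss-dialects", split="train")
# The rows below only mimic the {'sentence': ..., 'label': ...} schema.
rows = [
    {"sentence": "Grüezi mitenand", "label": "ZH"},
    {"sentence": "Hoi zäme", "label": "ZH"},
    {"sentence": "Grüessech", "label": "BE"},
]

def label_distribution(rows):
    """Count how many sentences carry each dialect label."""
    return Counter(r["label"] for r in rows)

print(label_distribution(rows))  # Counter({'ZH': 2, 'BE': 1})
```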
### Data Splits
[More Information Needed]
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
https://www.spur.uzh.ch/en/departments/research/textgroup/ArchiMob.html
## Additional Information
### Licensing Information
Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License
### Citation Information
Scherrer, Y., T. Samardžić, E. Glaser (2019). "Digitising Swiss German -- How to process and study a polycentric spoken language". Language Resources and Evaluation. (First online)
Scherrer, Y., T. Samardžić, E. Glaser (2019). "ArchiMob: Ein multidialektales Korpus schweizerdeutscher Spontansprache". Linguistik Online, 98(5), 425-454. https://doi.org/10.13092/lo.98.5947
|
gargsahil713repo/weather-data | ---
license: apache-2.0
---
|
Santiagodf19/Deliverde-cqav2 | ---
dataset_info:
features:
- name: example
dtype: string
splits:
- name: train
num_bytes: 80198
num_examples: 18
- name: test
num_bytes: 27388
num_examples: 6
download_size: 20133
dataset_size: 107586
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
lmms-lab/llava-bench-in-the-wild | ---
dataset_info:
features:
- name: question_id
dtype: int8
- name: question
dtype: string
- name: image
dtype: image
- name: caption
dtype: string
- name: gpt_answer
dtype: string
- name: category
dtype: string
- name: image_id
dtype: string
splits:
- name: train
num_bytes: 22333678.0
num_examples: 60
download_size: 9773451
dataset_size: 22333678.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
<p align="center" width="100%">
<img src="https://i.postimg.cc/g0QRgMVv/WX20240228-113337-2x.png" width="100%" height="80%">
</p>
# Large-scale Multi-modality Models Evaluation Suite
> Accelerating the development of large-scale multi-modality models (LMMs) with `lmms-eval`
🏠 [Homepage](https://lmms-lab.github.io/) | 📚 [Documentation](docs/README.md) | 🤗 [Huggingface Datasets](https://huggingface.co/lmms-lab)
# This Dataset
This is a formatted version of [LLaVA-Bench(wild)](https://llava-vl.github.io/) that is used in LLaVA. It is used in our `lmms-eval` pipeline to allow for one-click evaluations of large multi-modality models.
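As a rough sketch of how such a record could be turned into an evaluation prompt (field names are taken from the schema above; the template itself is illustrative, not the one `lmms-eval` actually uses):

```python
def build_eval_prompt(record):
    """Format a LLaVA-Bench record into a judge-style prompt (illustrative)."""
    return (
        f"[Context]\n{record['caption']}\n\n"
        f"[Question]\n{record['question']}\n\n"
        f"[Reference Answer]\n{record['gpt_answer']}\n"
    )

# Toy record mimicking the dataset's question/caption/gpt_answer fields.
sample = {
    "question": "What is unusual about this image?",
    "caption": "A man ironing clothes on the back of a taxi.",
    "gpt_answer": "The man is ironing while attached to a moving taxi.",
}
print(build_eval_prompt(sample))
```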
```
@misc{liu2023improvedllava,
author={Liu, Haotian and Li, Chunyuan and Li, Yuheng and Lee, Yong Jae},
title={Improved Baselines with Visual Instruction Tuning},
publisher={arXiv:2310.03744},
year={2023},
}
@inproceedings{liu2023llava,
author = {Liu, Haotian and Li, Chunyuan and Wu, Qingyang and Lee, Yong Jae},
title = {Visual Instruction Tuning},
booktitle = {NeurIPS},
year = {2023}
}
```
|
TLeonidas/twitter-hate-speech-en-240ksamples | ---
license: apache-2.0
---
This dataset is a combination of the three datasets listed below:
1. [tdavidson/hate_speech_offensive](https://huggingface.co/datasets/tdavidson/hate_speech_offensive)
2. [LennardZuendorf/Dynamically-Generated-Hate-Speech-Dataset](https://huggingface.co/datasets/LennardZuendorf/Dynamically-Generated-Hate-Speech-Dataset)
3. [ucberkeley-dlab/measuring-hate-speech](https://huggingface.co/datasets/ucberkeley-dlab/measuring-hate-speech)
It has only two columns, "tweet" and "labels", and 242738 rows of uncleaned data.
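Merging three differently-formatted sources into a shared `tweet`/`labels` schema presumably amounts to a column-normalization step; a hedged sketch (the per-source column names used here are illustrative, not the sources' actual ones):

```python
def normalize(record, text_key, label_key):
    """Map a record from any source dataset onto the shared schema."""
    return {"tweet": record[text_key], "labels": record[label_key]}

# Toy records standing in for rows from two of the source datasets.
merged = (
    [normalize(r, "tweet", "class") for r in [{"tweet": "a", "class": 0}]]
    + [normalize(r, "text", "label") for r in [{"text": "b", "label": 1}]]
)
print(merged)  # [{'tweet': 'a', 'labels': 0}, {'tweet': 'b', 'labels': 1}]
```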
|
HuggingFaceM4/ActivitiyNet_Captions | ---
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
language:
- en
license:
- other
multilinguality:
- monolingual
pretty_name: ActivityNet Captions
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- video-captioning
task_ids:
- closed-domain-qa
---
# Dataset Card for ActivityNet Captions
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://cs.stanford.edu/people/ranjaykrishna/densevid/
- **Paper:** https://arxiv.org/abs/1705.00754
### Dataset Summary
The ActivityNet Captions dataset connects videos to a series of temporally annotated sentence descriptions. Each sentence covers an unique segment of the video, describing multiple events that occur. These events may occur over very long or short periods of time and are not limited in any capacity, allowing them to co-occur. On average, each of the 20k videos contains 3.65 temporally localized sentences, resulting in a total of 100k sentences. We find that the number of sentences per video follows a relatively normal distribution. Furthermore, as the video duration increases, the number of sentences also increases. Each sentence has an average length of 13.48 words, which is also normally distributed. You can find more details of the dataset under the ActivityNet Captions Dataset section, and under supplementary materials in the paper.
### Languages
The captions in the dataset are in English.
## Dataset Structure
### Data Fields
- `video_id` : `str` unique identifier for the video
- `video_path`: `str` Path to the video file
- `duration`: `float32` Duration of the video
- `captions_starts`: `List_float32` List of timestamps denoting the time at which each caption starts
- `captions_ends`: `List_float32` List of timestamps denoting the time at which each caption ends
- `en_captions`: `list_str` List of english captions describing parts of the video
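The three parallel lists above can be zipped into per-caption segments; a minimal sketch using those field names (the sample values are invented):

```python
def caption_segments(record):
    """Pair each caption with its (start, end) time span in seconds."""
    return list(zip(record["captions_starts"],
                    record["captions_ends"],
                    record["en_captions"]))

# Toy record mimicking the captions_starts/captions_ends/en_captions fields.
sample = {
    "captions_starts": [0.0, 12.5],
    "captions_ends": [12.5, 30.0],
    "en_captions": ["A man picks up a guitar.", "He starts playing a song."],
}
for start, end, caption in caption_segments(sample):
    print(f"{start:6.1f}-{end:6.1f}s  {caption}")
```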
### Data Splits
| |train |validation| test | Overall |
|-------------|------:|---------:|------:|------:|
|# of videos|10,009 |4,917 |4,885 |19,811 |
### Annotations
Quoting [ActivityNet Captions' paper](https://arxiv.org/abs/1705.00754): \
"Each annotation task was divided into two steps: (1) Writing a paragraph describing all major events happening in the videos in a paragraph, with each sentence of the paragraph describing one event, and (2) Labeling the start and end time in the video in which each sentence in the paragraph event occurred."
### Who annotated the dataset?
Amazon Mechanical Turk annotators
### Personal and Sensitive Information
Nothing specifically mentioned in the paper.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Licensing Information
[More Information Needed]
### Citation Information
```bibtex
@inproceedings{krishna2017dense,
title={Dense-Captioning Events in Videos},
author={Krishna, Ranjay and Hata, Kenji and Ren, Frederic and Fei-Fei, Li and Niebles, Juan Carlos},
booktitle={International Conference on Computer Vision (ICCV)},
year={2017}
}
```
### Contributions
Thanks to [@leot13](https://github.com/leot13) for adding this dataset. |
smrynrz20/medical_quad | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 42807417
num_examples: 96330
- name: test
num_bytes: 5080536
num_examples: 12041
- name: validation
num_bytes: 5462819
num_examples: 12042
download_size: 57333451
dataset_size: 53350772
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
HuggingFaceH4/surge-pm-pilot | ---
license: apache-2.0
---
Pilot annotations for the PM dataset that will be used for RLHF. The dataset used outputs from open-source models (https://huggingface.co/spaces/HuggingFaceH4/instruction-models-outputs) on a mix of the Anthropic hh-rlhf dataset (https://huggingface.co/datasets/HuggingFaceH4/hh-rlhf) and Self-Instruct's seed dataset (https://huggingface.co/datasets/HuggingFaceH4/self-instruct-seed). |
apsr/col783_dataset | ---
dataset_info:
features:
- name: input_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 27683296.0
num_examples: 296
download_size: 27677426
dataset_size: 27683296.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-prehistory-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6833
num_examples: 5
- name: test
num_bytes: 1008140
num_examples: 324
download_size: 14881
dataset_size: 1014973
---
# Dataset Card for "mmlu-prehistory-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-emotion-default-139135-14996090 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: bhadresh-savani/roberta-base-emotion
metrics: ['roc_auc', 'mae']
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: bhadresh-savani/roberta-base-emotion
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@gmoney](https://huggingface.co/gmoney) for evaluating this model. |
SEACrowd/nllb_seed | ---
tags:
- machine-translation
language:
- ace
- bjn
- bug
- eng
---
# nllb_seed
No Language Left Behind Seed Data
NLLB Seed is a set of professionally-translated sentences in the Wikipedia domain. Data for NLLB-Seed was sampled from Wikimedia’s List of articles every Wikipedia should have, a collection of topics in different fields of knowledge and human activity. NLLB-Seed consists of around six thousand sentences in 39 languages. NLLB-Seed is meant to be used for training rather than model evaluation. Due to this difference, NLLB-Seed does not go through the human quality assurance process present in FLORES-200.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@article{nllb2022,
author = {NLLB Team, Marta R. Costa-jussà, James Cross, Onur Çelebi, Maha Elbayad, Kenneth Heafield, Kevin Heffernan, Elahe Kalbassi, Janice Lam, Daniel Licht, Jean Maillard, Anna Sun, Skyler Wang, Guillaume Wenzek, Al Youngblood, Bapi Akula, Loic Barrault, Gabriel Mejia Gonzalez, Prangthip Hansanti, John Hoffman, Semarley Jarrett, Kaushik Ram Sadagopan, Dirk Rowe, Shannon Spruit, Chau Tran, Pierre Andrews, Necip Fazil Ayan, Shruti Bhosale, Sergey Edunov, Angela Fan, Cynthia Gao, Vedanuj Goswami, Francisco Guzmán, Philipp Koehn, Alexandre Mourachko, Christophe Ropers, Safiyyah Saleem, Holger Schwenk, Jeff Wang},
title = {No Language Left Behind: Scaling Human-Centered Machine Translation},
year = {2022}
}
```
## License
CC-BY-SA 4.0
## Homepage
[https://github.com/facebookresearch/flores/tree/main/nllb_seed](https://github.com/facebookresearch/flores/tree/main/nllb_seed)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
rubentito/OCR-IDL | ---
annotations_creators:
- machine-generated
language:
- en
language_creators: []
license:
- wtfpl
multilinguality:
- monolingual
pretty_name: OCR-IDL
size_categories:
- 10M<n<100M
source_datasets:
- original
tags:
- pretraining
- documents
- idl
- ''
task_categories: []
task_ids: []
---
|
TurkuNLP/many_emotions_finnish | ---
task_categories:
- text-classification
language:
- fi
size_categories:
- 100K<n<1M
---
### Source:
This dataset is a machine-translated version of the Many Emotions dataset available here: https://huggingface.co/datasets/ma2za/many_emotions <br>
It was translated into Finnish using DeepL: https://www.deepl.com/translator <br>
The Many Emotions dataset itself is a combination of three other emotion-annotated datasets. These datasets are:
- Daily Dialog: https://huggingface.co/datasets/daily_dialog
- GoEmotions: https://huggingface.co/datasets/go_emotions
- Emotion: https://huggingface.co/datasets/dair-ai/emotion
Please, cite the original sources if you use this dataset.
### Data fields:
- `id`: unique identifier
- `text`: a `string` feature.
- `label`: a classification label, with possible values including `anger` (0), `fear` (1), `joy` (2), `love` (3), `sadness` (4), `surprise` (5), `neutral` (6).
- `dataset`: source dataset
- `license`: inherited license from source dataset
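The integer-to-name mapping described above can be encoded directly; a small helper (label names and ids copied from the field description):

```python
# Class labels as listed in the `label` field description.
EMOTION_LABELS = {
    0: "anger", 1: "fear", 2: "joy", 3: "love",
    4: "sadness", 5: "surprise", 6: "neutral",
}

def id2label(label_id):
    """Translate an integer class label into its emotion name."""
    return EMOTION_LABELS[label_id]

print(id2label(2))  # joy
```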
|
tyzhu/find_marker_both_sent_train_100_eval_40_recite | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 1175807
num_examples: 644
- name: validation
num_bytes: 376070
num_examples: 202
download_size: 270356
dataset_size: 1551877
---
# Dataset Card for "find_marker_both_sent_train_100_eval_40_recite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MU-Kindai/datasets-for-JCSE | ---
license: mit
---
|
AdaptLLM/FiQA_SA | ---
configs:
- config_name: FiQA_SA
data_files:
- split: train
path: train.csv
- split: test
path: test.csv
task_categories:
- text-classification
- question-answering
- zero-shot-classification
language:
- en
tags:
- finance
---
# Domain Adaptation of Large Language Models
This repo contains the **FiQA_SA dataset** used in our **ICLR 2024** paper [Adapting Large Language Models via Reading Comprehension](https://huggingface.co/papers/2309.09530).
We explore **continued pre-training on domain-specific corpora** for large language models. While this approach enriches LLMs with domain knowledge, it significantly hurts their prompting ability for question answering. Inspired by human learning via reading comprehension, we propose a simple method to **transform large-scale pre-training corpora into reading comprehension texts**, consistently improving prompting performance across tasks in biomedicine, finance, and law domains. **Our 7B model competes with much larger domain-specific models like BloombergGPT-50B**.
### 🤗 We are currently working hard on developing models across different domains, scales and architectures! Please stay tuned! 🤗
**************************** **Updates** ****************************
* 2024/4/2: Released the raw data splits (train and test) of all the evaluation datasets
* 2024/1/16: 🎉 Our [research paper](https://huggingface.co/papers/2309.09530) has been accepted by ICLR 2024!!!🎉
* 2023/12/19: Released our [13B base models](https://huggingface.co/AdaptLLM/law-LLM-13B) developed from LLaMA-1-13B.
* 2023/12/8: Released our [chat models](https://huggingface.co/AdaptLLM/law-chat) developed from LLaMA-2-Chat-7B.
* 2023/9/18: Released our [paper](https://huggingface.co/papers/2309.09530), [code](https://github.com/microsoft/LMOps), [data](https://huggingface.co/datasets/AdaptLLM/law-tasks), and [base models](https://huggingface.co/AdaptLLM/law-LLM) developed from LLaMA-1-7B.
## Domain-Specific LLaMA-1
### LLaMA-1-7B
In our paper, we develop three domain-specific models from LLaMA-1-7B, which are also available in Huggingface: [Biomedicine-LLM](https://huggingface.co/AdaptLLM/medicine-LLM), [Finance-LLM](https://huggingface.co/AdaptLLM/finance-LLM) and [Law-LLM](https://huggingface.co/AdaptLLM/law-LLM), the performances of our AdaptLLM compared to other domain-specific LLMs are:
<p align='center'>
<img src="https://cdn-uploads.huggingface.co/production/uploads/650801ced5578ef7e20b33d4/6efPwitFgy-pLTzvccdcP.png" width="700">
</p>
### LLaMA-1-13B
Moreover, we scale up our base model to LLaMA-1-13B to see if **our method is similarly effective for larger-scale models**, and the results are consistently positive too: [Biomedicine-LLM-13B](https://huggingface.co/AdaptLLM/medicine-LLM-13B), [Finance-LLM-13B](https://huggingface.co/AdaptLLM/finance-LLM-13B) and [Law-LLM-13B](https://huggingface.co/AdaptLLM/law-LLM-13B).
## Domain-Specific LLaMA-2-Chat
Our method is also effective for aligned models! LLaMA-2-Chat requires a [specific data format](https://huggingface.co/blog/llama2#how-to-prompt-llama-2), and our **reading comprehension can perfectly fit the data format** by transforming the reading comprehension into a multi-turn conversation. We have also open-sourced chat models in different domains: [Biomedicine-Chat](https://huggingface.co/AdaptLLM/medicine-chat), [Finance-Chat](https://huggingface.co/AdaptLLM/finance-chat) and [Law-Chat](https://huggingface.co/AdaptLLM/law-chat)
## Domain-Specific Tasks
### Pre-templatized/Formatted Testing Splits
To easily reproduce our prompting results, we have uploaded the filled-in zero/few-shot input instructions and output completions of the test split of each domain-specific task: [biomedicine-tasks](https://huggingface.co/datasets/AdaptLLM/medicine-tasks), [finance-tasks](https://huggingface.co/datasets/AdaptLLM/finance-tasks), and [law-tasks](https://huggingface.co/datasets/AdaptLLM/law-tasks).
**Note:** those filled-in instructions are specifically tailored for models before alignment and do NOT fit for the specific data format required for chat models.
### Raw Datasets
We have also uploaded the raw training and testing splits, for facilitating fine-tuning or other usages:
- [ChemProt](https://huggingface.co/datasets/AdaptLLM/ChemProt)
- [RCT](https://huggingface.co/datasets/AdaptLLM/RCT)
- [ConvFinQA](https://huggingface.co/datasets/AdaptLLM/ConvFinQA)
- [FiQA_SA](https://huggingface.co/datasets/AdaptLLM/FiQA_SA)
- [Headline](https://huggingface.co/datasets/AdaptLLM/Headline)
- [NER](https://huggingface.co/datasets/AdaptLLM/NER)
- [FPB](https://huggingface.co/datasets/AdaptLLM/FPB)
The other datasets used in our paper have already been available in huggingface, and you can directly load them with the following code:
```python
from datasets import load_dataset
# MQP:
dataset = load_dataset('medical_questions_pairs')
# PubmedQA:
dataset = load_dataset('bigbio/pubmed_qa')
# USMLE:
dataset = load_dataset('GBaker/MedQA-USMLE-4-options')
# SCOTUS
dataset = load_dataset("lex_glue", 'scotus')
# CaseHOLD
dataset = load_dataset("lex_glue", 'case_hold')
# UNFAIR-ToS
dataset = load_dataset("lex_glue", 'unfair_tos')
```
## Citation
If you find our work helpful, please cite us:
```bibtex
@inproceedings{
cheng2024adapting,
title={Adapting Large Language Models via Reading Comprehension},
author={Daixuan Cheng and Shaohan Huang and Furu Wei},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=y886UXPEZ0}
}
```
and the original dataset:
```bibtex
@inproceedings{FiQA-SA,
author = {Macedo Maia and
Siegfried Handschuh and
Andr{\'{e}} Freitas and
Brian Davis and
Ross McDermott and
Manel Zarrouk and
Alexandra Balahur},
title = {WWW'18 Open Challenge: Financial Opinion Mining and Question Answering},
booktitle = {{WWW} (Companion Volume)},
pages = {1941--1942},
publisher = {{ACM}},
year = {2018}
}
``` |
felipeoes/br_federal_legislation_qa | ---
dataset_info:
features:
- name: year
dtype: int64
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: prompt
dtype: string
- name: generated_content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3585448030
num_examples: 128180
download_size: 1300170370
dataset_size: 3585448030
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jaese/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: comments
dtype: int64
- name: created_at
dtype: string
- name: updated_at
dtype: string
- name: author_association
dtype: string
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: closed_at
dtype: string
- name: state_reason
dtype: string
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 16723706
num_examples: 5524
download_size: 3919619
dataset_size: 16723706
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqadupstack-gaming-qrels | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 60520
num_examples: 2263
download_size: 0
dataset_size: 60520
---
# Dataset Card for "cqadupstack-gaming-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/gov2_trec-tb-2005_named-page | ---
pretty_name: '`gov2/trec-tb-2005/named-page`'
viewer: false
source_datasets: ['irds/gov2']
task_categories:
- text-retrieval
---
# Dataset Card for `gov2/trec-tb-2005/named-page`
The `gov2/trec-tb-2005/named-page` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/gov2#gov2/trec-tb-2005/named-page).
# Data
This dataset provides:
- `queries` (i.e., topics); count=252
- `qrels` (relevance assessments); count=11,729
- For `docs`, use [`irds/gov2`](https://huggingface.co/datasets/irds/gov2)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/gov2_trec-tb-2005_named-page', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/gov2_trec-tb-2005_named-page', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Clarke2005TrecTerabyte,
title={The TREC 2005 Terabyte Track},
  author={Charles L. A. Clarke and Falk Scholer and Ian Soboroff},
booktitle={TREC},
year={2005}
}
```
|
gsgoncalves/roberta_pretrain | ---
license: unknown
task_categories:
- fill-mask
- text-generation
language:
- en
pretty_name: RoBERTa Pretrain Dataset
size_categories:
- 10M<n<100M
---
# Dataset Card for RoBERTa Pretrain
### Dataset Summary
This is the concatenation of the datasets used to pretrain RoBERTa.
The dataset is not shuffled and contains raw text. It is packaged for convenience.
It is essentially the same as running:
```
from datasets import load_dataset, concatenate_datasets

bookcorpus = load_dataset("bookcorpus", split="train")
openweb = load_dataset("openwebtext", split="train")
cc_news = load_dataset("cc_news", split="train")
# Keep only the raw text column so all datasets share the same schema.
cc_news = cc_news.remove_columns([col for col in cc_news.column_names if col != "text"])
cc_stories = load_dataset("spacemanidol/cc-stories", split="train")
# cc_stories is already a single split, so it is concatenated directly.
dataset = concatenate_datasets([bookcorpus, openweb, cc_news, cc_stories])
``` |
james-burton/jigsaw_unintended_bias100K_ordinal | ---
dataset_info:
features:
- name: comment_text
dtype: string
- name: asian
dtype: float64
- name: atheist
dtype: float64
- name: bisexual
dtype: float64
- name: black
dtype: float64
- name: buddhist
dtype: float64
- name: christian
dtype: float64
- name: female
dtype: float64
- name: heterosexual
dtype: float64
- name: hindu
dtype: float64
- name: homosexual_gay_or_lesbian
dtype: float64
- name: intellectual_or_learning_disability
dtype: float64
- name: jewish
dtype: float64
- name: latino
dtype: float64
- name: male
dtype: float64
- name: muslim
dtype: float64
- name: other_disability
dtype: float64
- name: other_gender
dtype: float64
- name: other_race_or_ethnicity
dtype: float64
- name: other_religion
dtype: float64
- name: other_sexual_orientation
dtype: float64
- name: physical_disability
dtype: float64
- name: psychiatric_or_mental_illness
dtype: float64
- name: transgender
dtype: float64
- name: white
dtype: float64
- name: funny
dtype: int64
- name: wow
dtype: int64
- name: sad
dtype: int64
- name: likes
dtype: int64
- name: disagree
dtype: int64
- name: target
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 46984927
num_examples: 85000
- name: validation
num_bytes: 8285559
num_examples: 15000
- name: test
num_bytes: 13825536
num_examples: 25000
download_size: 0
dataset_size: 69096022
---
# Dataset Card for "jigsaw_unintended_bias100K_ordinal"
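This card stores `target` as an int64, while the original Jigsaw Unintended Bias toxicity target is a continuous score in [0, 1], so an ordinal label of this kind is typically obtained by binning the score into ordered classes. A hypothetical sketch of that step (the thresholds are illustrative, not the ones used to build this dataset):

```python
import bisect

# Hypothetical boundaries between ordinal classes 0..3.
THRESHOLDS = [0.25, 0.5, 0.75]

def to_ordinal(score, thresholds=THRESHOLDS):
    """Map a continuous score in [0, 1] to an ordinal class index.

    bisect_right counts how many thresholds the score meets or exceeds,
    which is exactly the index of its ordered bin.
    """
    return bisect.bisect_right(thresholds, score)

print([to_ordinal(s) for s in (0.0, 0.3, 0.5, 0.9)])  # [0, 1, 2, 3]
```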
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_twodgirl__Nimue-7B | ---
pretty_name: Evaluation run of twodgirl/Nimue-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [twodgirl/Nimue-7B](https://huggingface.co/twodgirl/Nimue-7B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_twodgirl__Nimue-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-02T20:51:19.081455](https://huggingface.co/datasets/open-llm-leaderboard/details_twodgirl__Nimue-7B/blob/main/results_2024-03-02T20-51-19.081455.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6482366584692896,\n\
\ \"acc_stderr\": 0.032115374994645604,\n \"acc_norm\": 0.6492894651664455,\n\
\ \"acc_norm_stderr\": 0.03278199572773934,\n \"mc1\": 0.3329253365973072,\n\
\ \"mc1_stderr\": 0.016497402382012055,\n \"mc2\": 0.5088711574256769,\n\
\ \"mc2_stderr\": 0.015340220211921846\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5853242320819113,\n \"acc_stderr\": 0.014397070564409174,\n\
\ \"acc_norm\": 0.6373720136518771,\n \"acc_norm_stderr\": 0.014049106564955009\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6132244572794264,\n\
\ \"acc_stderr\": 0.00486016207633098,\n \"acc_norm\": 0.8274248157737503,\n\
\ \"acc_norm_stderr\": 0.0037710731802147288\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188712,\n \"\
acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188712\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276878,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276878\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.038498560987940876,\n \"\
acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940876\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092382,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092382\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407004,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407004\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508283,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n\
\ \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n\
\ \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.02971928127223684,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.02971928127223684\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.455019556714472,\n\
\ \"acc_stderr\": 0.01271845661870177,\n \"acc_norm\": 0.455019556714472,\n\
\ \"acc_norm_stderr\": 0.01271845661870177\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.027778298701545443,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.027778298701545443\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623557,\n \
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623557\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072766,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072766\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3329253365973072,\n\
\ \"mc1_stderr\": 0.016497402382012055,\n \"mc2\": 0.5088711574256769,\n\
\ \"mc2_stderr\": 0.015340220211921846\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828077\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6679302501895376,\n \
\ \"acc_stderr\": 0.01297246503436187\n }\n}\n```"
repo_url: https://huggingface.co/twodgirl/Nimue-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|arc:challenge|25_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|gsm8k|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hellaswag|10_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T20-51-19.081455.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T20-51-19.081455.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- '**/details_harness|winogrande|5_2024-03-02T20-51-19.081455.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-02T20-51-19.081455.parquet'
- config_name: results
data_files:
- split: 2024_03_02T20_51_19.081455
path:
- results_2024-03-02T20-51-19.081455.parquet
- split: latest
path:
- results_2024-03-02T20-51-19.081455.parquet
---
# Dataset Card for Evaluation run of twodgirl/Nimue-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [twodgirl/Nimue-7B](https://huggingface.co/twodgirl/Nimue-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_twodgirl__Nimue-7B",
"harness_winogrande_5",
	split="latest")
```
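The per-task config names in the YAML header above appear to be derived mechanically from the harness task names, with `|`, `:`, and `-` replaced by underscores. This is an observation from the config list, not a documented API; a minimal helper sketch:

```python
def harness_config_name(task: str) -> str:
    # Map a harness task name (e.g. "harness|hendrycksTest-anatomy|5")
    # to the corresponding config name accepted by load_dataset
    # (e.g. "harness_hendrycksTest_anatomy_5").
    for sep in "|:-":
        task = task.replace(sep, "_")
    return task

print(harness_config_name("harness|hendrycksTest-anatomy|5"))  # harness_hendrycksTest_anatomy_5
print(harness_config_name("harness|truthfulqa:mc|0"))          # harness_truthfulqa_mc_0
```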
## Latest results
These are the [latest results from run 2024-03-02T20:51:19.081455](https://huggingface.co/datasets/open-llm-leaderboard/details_twodgirl__Nimue-7B/blob/main/results_2024-03-02T20-51-19.081455.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6482366584692896,
"acc_stderr": 0.032115374994645604,
"acc_norm": 0.6492894651664455,
"acc_norm_stderr": 0.03278199572773934,
"mc1": 0.3329253365973072,
"mc1_stderr": 0.016497402382012055,
"mc2": 0.5088711574256769,
"mc2_stderr": 0.015340220211921846
},
"harness|arc:challenge|25": {
"acc": 0.5853242320819113,
"acc_stderr": 0.014397070564409174,
"acc_norm": 0.6373720136518771,
"acc_norm_stderr": 0.014049106564955009
},
"harness|hellaswag|10": {
"acc": 0.6132244572794264,
"acc_stderr": 0.00486016207633098,
"acc_norm": 0.8274248157737503,
"acc_norm_stderr": 0.0037710731802147288
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188712,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188712
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276878,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276878
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940876,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940876
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092382,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407004,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407004
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508283,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712992,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712992
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.02971928127223684,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.02971928127223684
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.455019556714472,
"acc_stderr": 0.01271845661870177,
"acc_norm": 0.455019556714472,
"acc_norm_stderr": 0.01271845661870177
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.027778298701545443,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.027778298701545443
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.019162418588623557,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.019162418588623557
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072766,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072766
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3329253365973072,
"mc1_stderr": 0.016497402382012055,
"mc2": 0.5088711574256769,
"mc2_stderr": 0.015340220211921846
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828077
},
"harness|gsm8k|5": {
"acc": 0.6679302501895376,
"acc_stderr": 0.01297246503436187
}
}
```
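The flat structure of the results above makes them easy to post-process. As a sketch (over a small hypothetical excerpt of the parsed JSON, not the full file), this averages the `acc_norm` scores of just the MMLU (`hendrycksTest`) subtasks:

```python
import statistics

# Small excerpt of the parsed results JSON shown above (hypothetical variable).
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6373720136518771},
    "harness|hellaswag|10": {"acc_norm": 0.8274248157737503},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.38},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.5777777777777777},
}

# Keep only the MMLU (hendrycksTest) subtasks and average their acc_norm.
mmlu = [
    metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(len(mmlu), round(statistics.mean(mmlu), 4))  # 2 0.4789
```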
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-staging-eval-project-emotion-c4654930-10865460 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: bhadresh-savani/electra-base-emotion
metrics: []
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: bhadresh-savani/electra-base-emotion
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@bhadresh-savani](https://huggingface.co/bhadresh-savani) for evaluating this model. |
sergiopaniego/CarlaFollowLanePreviousV | ---
license: apache-2.0
---
# CarlaFollowLanePreviousV
This dataset contains images extracted from the CARLA simulator using an expert agent, for imitation learning.
The expert agent is the autopilot of CARLA 0.9.12. We store the bird's-eye-view camera images together with the control commands generated
by the agent, including the previous speed.
## Dataset details
| Folder | Scenario | Number of examples |
| ------------------------------- | ---------------------- | --------------------------------- |
| carla_dataset_16_11_clockwise_town_01_extreme_2 | Town01 | 1582 |
| carla_dataset_24_07_anticlockwise_town_01_extreme | Town01 | 4957 |
| carla_dataset_test_04_11_clockwise_town_01_previous_v_extreme | Town01 | 1911 |
| carla_dataset_test_31_10_anticlockwise_town_01_previous_v | Town01 | 6184 |
| carla_dataset_test_31_10_clockwise_town_01_previous_v | Town01 | 6056 |
| carla_dataset_test_04_11_anticlockwise_town_03_previous_v | Town03 | 7285 |
| carla_dataset_test_04_11_clockwise_town_03_previous_v | Town03 | 5487 |
| carla_dataset_test_04_11_anticlockwise_town_05_previous_v | Town05 | 10375 |
| carla_dataset_test_04_11_clockwise_town_05_previous_v | Town05 | 12094 |
| carla_dataset_test_04_11_anticlockwise_town_07_previous_v | Town07 | 1781 |
| carla_dataset_test_04_11_clockwise_town_07_previous_v | Town07 | 1930 |
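As a quick sanity check, the per-folder counts above can be aggregated by town with a short script (folder names and counts are transcribed from the table; this is a sketch only and does not read any dataset files):

```python
# Per-folder example counts, transcribed from the table above.
# Folder name -> (scenario town, number of examples)
folders = {
    "carla_dataset_16_11_clockwise_town_01_extreme_2": ("Town01", 1582),
    "carla_dataset_24_07_anticlockwise_town_01_extreme": ("Town01", 4957),
    "carla_dataset_test_04_11_clockwise_town_01_previous_v_extreme": ("Town01", 1911),
    "carla_dataset_test_31_10_anticlockwise_town_01_previous_v": ("Town01", 6184),
    "carla_dataset_test_31_10_clockwise_town_01_previous_v": ("Town01", 6056),
    "carla_dataset_test_04_11_anticlockwise_town_03_previous_v": ("Town03", 7285),
    "carla_dataset_test_04_11_clockwise_town_03_previous_v": ("Town03", 5487),
    "carla_dataset_test_04_11_anticlockwise_town_05_previous_v": ("Town05", 10375),
    "carla_dataset_test_04_11_clockwise_town_05_previous_v": ("Town05", 12094),
    "carla_dataset_test_04_11_anticlockwise_town_07_previous_v": ("Town07", 1781),
    "carla_dataset_test_04_11_clockwise_town_07_previous_v": ("Town07", 1930),
}

# Aggregate example counts per town.
per_town = {}
for town, n in folders.values():
    per_town[town] = per_town.get(town, 0) + n

total = sum(per_town.values())
print(per_town)  # {'Town01': 20690, 'Town03': 12772, 'Town05': 22469, 'Town07': 3711}
print(total)     # 59642
```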
|
jonathan-roberts1/WHU-RS19 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': airport
'1': beach
'2': bridge
'3': commercial
'4': desert
'5': farmland
'6': football field
'7': forest
'8': industrial
'9': meadow
'10': mountain
'11': park
'12': parking
'13': pond
'14': port
'15': railway station
'16': residential
'17': river
'18': viaduct
splits:
- name: train
num_bytes: 115362308.8
num_examples: 1005
download_size: 113327264
dataset_size: 115362308.8
license: cc-by-4.0
---
# Dataset Card for "WHU-RS19"
## Dataset Description
- **Paper:** [Structural high-resolution satellite image indexing](https://hal.science/hal-00458685/document)
- **Paper:** [Satellite image classification via two-layer sparse coding with biased image representation](https://ieeexplore.ieee.org/iel5/8859/4357975/05545358.pdf)
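The `class_label` block in the metadata above fixes the integer-to-name mapping for the 19 scene classes. As a small self-contained sketch (names transcribed from the YAML, not loaded from the Hub), the mapping can be expressed as:

```python
# The 19 class names from the `class_label` block above, in label order.
CLASS_NAMES = [
    "airport", "beach", "bridge", "commercial", "desert", "farmland",
    "football field", "forest", "industrial", "meadow", "mountain",
    "park", "parking", "pond", "port", "railway station", "residential",
    "river", "viaduct",
]

def label_to_name(label: int) -> str:
    """Map an integer label from the dataset to its class name."""
    return CLASS_NAMES[label]

print(label_to_name(15))  # railway station
```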
### Licensing Information
Public Domain
## Citation Information
[Structural high-resolution satellite image indexing](https://hal.science/hal-00458685/document)
[Satellite image classification via two-layer sparse coding with biased image representation](https://ieeexplore.ieee.org/iel5/8859/4357975/05545358.pdf)
```
@article{xia2009structural,
title={Structural high-resolution satellite image indexing},
author={Xia, Gui-Song and Yang, Wen and Delon, Julie and Gousseau, Yann and Sun, Hong and Ma{\^\i}tre, Henri},
year={2009}
}
@article{dai2010satellite,
title={Satellite image classification via two-layer sparse coding with biased image representation},
author={Dai, Dengxin and Yang, Wen},
journal={IEEE Geoscience and remote sensing letters},
volume={8},
number={1},
pages={173--176},
year={2010},
publisher={IEEE}
}
``` |
Bytebuyruklandirici/Updllset | ---
license: apache-2.0
---
|
arazd/tulu_gpt4_alpaca | ---
license: openrail
---
|
open-llm-leaderboard/details_Riiid__sheep-duck-llama-2-70b-v1.1 | ---
pretty_name: Evaluation run of Riiid/sheep-duck-llama-2-70b-v1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Riiid/sheep-duck-llama-2-70b-v1.1](https://huggingface.co/Riiid/sheep-duck-llama-2-70b-v1.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Riiid__sheep-duck-llama-2-70b-v1.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T22:48:41.234684](https://huggingface.co/datasets/open-llm-leaderboard/details_Riiid__sheep-duck-llama-2-70b-v1.1/blob/main/results_2023-12-09T22-48-41.234684.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7053343320815155,\n\
\ \"acc_stderr\": 0.030260160101824644,\n \"acc_norm\": 0.7109334613998801,\n\
\ \"acc_norm_stderr\": 0.03084136530304881,\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6454578975757769,\n\
\ \"mc2_stderr\": 0.014741040304266572\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6868600682593856,\n \"acc_stderr\": 0.013552671543623501,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710688\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.691396136227843,\n\
\ \"acc_stderr\": 0.004609731925736905,\n \"acc_norm\": 0.8777136028679546,\n\
\ \"acc_norm_stderr\": 0.0032694673590543157\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7433962264150943,\n \"acc_stderr\": 0.026880647889051985,\n\
\ \"acc_norm\": 0.7433962264150943,\n \"acc_norm_stderr\": 0.026880647889051985\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.03216600808802267,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.03216600808802267\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.03514942551267439,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.03514942551267439\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6893617021276596,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.6893617021276596,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.04013124195424386,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.04013124195424386\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47354497354497355,\n \"acc_stderr\": 0.02571523981134676,\n \"\
acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.02571523981134676\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.021886178567172523,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.021886178567172523\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5566502463054187,\n \"acc_stderr\": 0.03495334582162933,\n\
\ \"acc_norm\": 0.5566502463054187,\n \"acc_norm_stderr\": 0.03495334582162933\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8888888888888888,\n \"acc_stderr\": 0.02239078763821677,\n \"\
acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02239078763821677\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360755,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360755\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7128205128205128,\n \"acc_stderr\": 0.022939925418530616,\n\
\ \"acc_norm\": 0.7128205128205128,\n \"acc_norm_stderr\": 0.022939925418530616\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524586,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524586\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7689075630252101,\n \"acc_stderr\": 0.027381406927868876,\n\
\ \"acc_norm\": 0.7689075630252101,\n \"acc_norm_stderr\": 0.027381406927868876\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8990825688073395,\n \"acc_stderr\": 0.012914673545364408,\n \"\
acc_norm\": 0.8990825688073395,\n \"acc_norm_stderr\": 0.012914673545364408\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878467,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878467\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.0321782942074463,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.0321782942074463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044579,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044579\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8466257668711656,\n \"acc_stderr\": 0.0283116014414386,\n\
\ \"acc_norm\": 0.8466257668711656,\n \"acc_norm_stderr\": 0.0283116014414386\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n\
\ \"acc_stderr\": 0.01745698787243618,\n \"acc_norm\": 0.9230769230769231,\n\
\ \"acc_norm_stderr\": 0.01745698787243618\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542126,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542126\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8684546615581098,\n\
\ \"acc_stderr\": 0.01208670521425043,\n \"acc_norm\": 0.8684546615581098,\n\
\ \"acc_norm_stderr\": 0.01208670521425043\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7745664739884393,\n \"acc_stderr\": 0.022497230190967554,\n\
\ \"acc_norm\": 0.7745664739884393,\n \"acc_norm_stderr\": 0.022497230190967554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6346368715083799,\n\
\ \"acc_stderr\": 0.0161048338801423,\n \"acc_norm\": 0.6346368715083799,\n\
\ \"acc_norm_stderr\": 0.0161048338801423\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\
\ \"acc_stderr\": 0.023839303311398205,\n \"acc_norm\": 0.7717041800643086,\n\
\ \"acc_norm_stderr\": 0.023839303311398205\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.021038517770157375,\n\
\ \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.021038517770157375\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5886524822695035,\n \"acc_stderr\": 0.029354911159940968,\n \
\ \"acc_norm\": 0.5886524822695035,\n \"acc_norm_stderr\": 0.029354911159940968\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5880052151238592,\n\
\ \"acc_stderr\": 0.012570871032146064,\n \"acc_norm\": 0.5880052151238592,\n\
\ \"acc_norm_stderr\": 0.012570871032146064\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02679956202488766,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02679956202488766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7663398692810458,\n \"acc_stderr\": 0.017119158496044506,\n \
\ \"acc_norm\": 0.7663398692810458,\n \"acc_norm_stderr\": 0.017119158496044506\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n\
\ \"acc_stderr\": 0.04122066502878285,\n \"acc_norm\": 0.7545454545454545,\n\
\ \"acc_norm_stderr\": 0.04122066502878285\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7959183673469388,\n \"acc_stderr\": 0.025801283475090496,\n\
\ \"acc_norm\": 0.7959183673469388,\n \"acc_norm_stderr\": 0.025801283475090496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015575,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015575\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6454578975757769,\n\
\ \"mc2_stderr\": 0.014741040304266572\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8310970797158642,\n \"acc_stderr\": 0.010529981411838897\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4799090219863533,\n \
\ \"acc_stderr\": 0.013761361772989008\n }\n}\n```"
repo_url: https://huggingface.co/Riiid/sheep-duck-llama-2-70b-v1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|arc:challenge|25_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|arc:challenge|25_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|gsm8k|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hellaswag|10_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hellaswag|10_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-20-47.805368.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T22-48-41.234684.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T07-20-47.805368.parquet'
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T22-48-41.234684.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T22_48_41.234684
path:
- '**/details_harness|winogrande|5_2023-12-09T22-48-41.234684.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T22-48-41.234684.parquet'
- config_name: results
data_files:
- split: 2023_10_04T07_20_47.805368
path:
- results_2023-10-04T07-20-47.805368.parquet
- split: 2023_12_09T22_48_41.234684
path:
- results_2023-12-09T22-48-41.234684.parquet
- split: latest
path:
- results_2023-12-09T22-48-41.234684.parquet
---
# Dataset Card for Evaluation run of Riiid/sheep-duck-llama-2-70b-v1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Riiid/sheep-duck-llama-2-70b-v1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Riiid/sheep-duck-llama-2-70b-v1.1](https://huggingface.co/Riiid/sheep-duck-llama-2-70b-v1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Riiid__sheep-duck-llama-2-70b-v1.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-09T22:48:41.234684](https://huggingface.co/datasets/open-llm-leaderboard/details_Riiid__sheep-duck-llama-2-70b-v1.1/blob/main/results_2023-12-09T22-48-41.234684.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7053343320815155,
"acc_stderr": 0.030260160101824644,
"acc_norm": 0.7109334613998801,
"acc_norm_stderr": 0.03084136530304881,
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.6454578975757769,
"mc2_stderr": 0.014741040304266572
},
"harness|arc:challenge|25": {
"acc": 0.6868600682593856,
"acc_stderr": 0.013552671543623501,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710688
},
"harness|hellaswag|10": {
"acc": 0.691396136227843,
"acc_stderr": 0.004609731925736905,
"acc_norm": 0.8777136028679546,
"acc_norm_stderr": 0.0032694673590543157
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7433962264150943,
"acc_stderr": 0.026880647889051985,
"acc_norm": 0.7433962264150943,
"acc_norm_stderr": 0.026880647889051985
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.03216600808802267,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.03216600808802267
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267439,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267439
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6893617021276596,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.6893617021276596,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.04013124195424386,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.04013124195424386
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47354497354497355,
"acc_stderr": 0.02571523981134676,
"acc_norm": 0.47354497354497355,
"acc_norm_stderr": 0.02571523981134676
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172523,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172523
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5566502463054187,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.5566502463054187,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02239078763821677,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02239078763821677
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360755,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360755
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7128205128205128,
"acc_stderr": 0.022939925418530616,
"acc_norm": 0.7128205128205128,
"acc_norm_stderr": 0.022939925418530616
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524586,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524586
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7689075630252101,
"acc_stderr": 0.027381406927868876,
"acc_norm": 0.7689075630252101,
"acc_norm_stderr": 0.027381406927868876
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8990825688073395,
"acc_stderr": 0.012914673545364408,
"acc_norm": 0.8990825688073395,
"acc_norm_stderr": 0.012914673545364408
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.01831885585008968,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.01831885585008968
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878467,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878467
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.0321782942074463,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.0321782942074463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.03092278832044579,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.03092278832044579
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8466257668711656,
"acc_stderr": 0.0283116014414386,
"acc_norm": 0.8466257668711656,
"acc_norm_stderr": 0.0283116014414386
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.01745698787243618,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.01745698787243618
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8684546615581098,
"acc_stderr": 0.01208670521425043,
"acc_norm": 0.8684546615581098,
"acc_norm_stderr": 0.01208670521425043
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7745664739884393,
"acc_stderr": 0.022497230190967554,
"acc_norm": 0.7745664739884393,
"acc_norm_stderr": 0.022497230190967554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6346368715083799,
"acc_stderr": 0.0161048338801423,
"acc_norm": 0.6346368715083799,
"acc_norm_stderr": 0.0161048338801423
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.023839303311398205,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.023839303311398205
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.021038517770157375,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.021038517770157375
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5886524822695035,
"acc_stderr": 0.029354911159940968,
"acc_norm": 0.5886524822695035,
"acc_norm_stderr": 0.029354911159940968
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5880052151238592,
"acc_stderr": 0.012570871032146064,
"acc_norm": 0.5880052151238592,
"acc_norm_stderr": 0.012570871032146064
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02679956202488766,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02679956202488766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7663398692810458,
"acc_stderr": 0.017119158496044506,
"acc_norm": 0.7663398692810458,
"acc_norm_stderr": 0.017119158496044506
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.04122066502878285,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.04122066502878285
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7959183673469388,
"acc_stderr": 0.025801283475090496,
"acc_norm": 0.7959183673469388,
"acc_norm_stderr": 0.025801283475090496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015575,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015575
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.6454578975757769,
"mc2_stderr": 0.014741040304266572
},
"harness|winogrande|5": {
"acc": 0.8310970797158642,
"acc_stderr": 0.010529981411838897
},
"harness|gsm8k|5": {
"acc": 0.4799090219863533,
"acc_stderr": 0.013761361772989008
}
}
```
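Once loaded, a results dict shaped like the one above can be post-processed directly. As a minimal sketch (using a small hand-copied excerpt of the entries above, not the full file), the snippet below averages the accuracies of the MMLU (`hendrycksTest`) tasks by filtering on the key prefix:

```python
# Sketch: average the "acc" of all MMLU (hendrycksTest) tasks in a results dict.
# The dict below is a truncated excerpt of the results shown above, for illustration.
results = {
    "harness|arc:challenge|25": {"acc": 0.6868600682593856},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6296296296296297},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8026315789473685},
    "harness|hendrycksTest-virology|5": {"acc": 0.536144578313253},
}

# Keep only the MMLU sub-tasks, identified by their "harness|hendrycksTest-" prefix.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU tasks: {len(mmlu_accs)}, mean acc: {mmlu_mean:.4f}")
```

The same filtering works on the full results file, since every MMLU entry shares that key prefix.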
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
liahchan/wnut_train_subset | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 123526.0
num_examples: 350
download_size: 40132
dataset_size: 123526.0
---
# Dataset Card for "wnut_train_subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pharaouk/stack-v2-python-chunk2 | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: repo_url
dtype: string
- name: snapshot_id
dtype: string
- name: revision_id
dtype: string
- name: directory_id
dtype: string
- name: branch_name
dtype: string
- name: visit_date
dtype: timestamp[ns]
- name: revision_date
dtype: timestamp[ns]
- name: committer_date
dtype: timestamp[ns]
- name: github_id
dtype: int64
- name: star_events_count
dtype: int64
- name: fork_events_count
dtype: int64
- name: gha_license_id
dtype: string
- name: gha_created_at
dtype: timestamp[ns]
- name: gha_updated_at
dtype: timestamp[ns]
- name: gha_pushed_at
dtype: timestamp[ns]
- name: gha_language
dtype: string
- name: files
list:
- name: blob_id
dtype: string
- name: path
dtype: string
- name: content_id
dtype: string
- name: language
dtype: string
- name: length_bytes
dtype: int64
- name: detected_licenses
sequence: string
- name: license_type
dtype: string
- name: src_encoding
dtype: string
- name: is_vendor
dtype: bool
- name: is_generated
dtype: bool
- name: alphanum_fraction
dtype: float32
- name: alpha_fraction
dtype: float32
- name: num_lines
dtype: int32
- name: avg_line_length
dtype: float32
- name: max_line_length
dtype: int32
- name: num_files
dtype: int64
splits:
- name: train
num_bytes: 7890229153
num_examples: 2984968
download_size: 5039112278
dataset_size: 7890229153
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics | ---
pretty_name: Evaluation run of behnamsh/gpt2_platypus-camel_physics
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [behnamsh/gpt2_platypus-camel_physics](https://huggingface.co/behnamsh/gpt2_platypus-camel_physics)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T03:21:29.726636](https://huggingface.co/datasets/open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics/blob/main/results_2023-10-28T03-21-29.726636.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0032508389261744967,\n\
\ \"em_stderr\": 0.0005829486708558949,\n \"f1\": 0.048523489932885924,\n\
\ \"f1_stderr\": 0.0013787561628243192,\n \"acc\": 0.25769534333070243,\n\
\ \"acc_stderr\": 0.0070229133948918315\n },\n \"harness|drop|3\":\
\ {\n \"em\": 0.0032508389261744967,\n \"em_stderr\": 0.0005829486708558949,\n\
\ \"f1\": 0.048523489932885924,\n \"f1_stderr\": 0.0013787561628243192\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5153906866614049,\n\
\ \"acc_stderr\": 0.014045826789783663\n }\n}\n```"
repo_url: https://huggingface.co/behnamsh/gpt2_platypus-camel_physics
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|arc:challenge|25_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T03_21_29.726636
path:
- '**/details_harness|drop|3_2023-10-28T03-21-29.726636.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T03-21-29.726636.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T03_21_29.726636
path:
- '**/details_harness|gsm8k|5_2023-10-28T03-21-29.726636.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T03-21-29.726636.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hellaswag|10_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-47-32.538128.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T04-47-32.538128.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T04-47-32.538128.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T03_21_29.726636
path:
- '**/details_harness|winogrande|5_2023-10-28T03-21-29.726636.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T03-21-29.726636.parquet'
- config_name: results
data_files:
- split: 2023_09_12T04_47_32.538128
path:
- results_2023-09-12T04-47-32.538128.parquet
- split: 2023_10_28T03_21_29.726636
path:
- results_2023-10-28T03-21-29.726636.parquet
- split: latest
path:
- results_2023-10-28T03-21-29.726636.parquet
---
# Dataset Card for Evaluation run of behnamsh/gpt2_platypus-camel_physics
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/behnamsh/gpt2_platypus-camel_physics
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [behnamsh/gpt2_platypus-camel_physics](https://huggingface.co/behnamsh/gpt2_platypus-camel_physics) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T03:21:29.726636](https://huggingface.co/datasets/open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics/blob/main/results_2023-10-28T03-21-29.726636.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0032508389261744967,
"em_stderr": 0.0005829486708558949,
"f1": 0.048523489932885924,
"f1_stderr": 0.0013787561628243192,
"acc": 0.25769534333070243,
"acc_stderr": 0.0070229133948918315
},
"harness|drop|3": {
"em": 0.0032508389261744967,
"em_stderr": 0.0005829486708558949,
"f1": 0.048523489932885924,
"f1_stderr": 0.0013787561628243192
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5153906866614049,
"acc_stderr": 0.014045826789783663
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
JorgeAcevedx/portrait | ---
license: afl-3.0
---
|
KalaiselvanD/kalai-ds-1 | ---
license: apache-2.0
---
|
mteb/neuclir-2023-rus | ---
language:
- rus
multilinguality:
- monolingual
task_categories:
- text-retrieval
source_datasets:
- neuclir
task_ids:
- document-retrieval
config_names:
- corpus
tags:
- text-retrieval
dataset_info:
- config_name: default
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: float64
splits:
- name: test
num_examples: 25634
- config_name: corpus
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: corpus
num_examples: 4627543
- config_name: queries
features:
- name: _id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_examples: 76
configs:
- config_name: default
data_files:
- split: test
path: qrels/test.jsonl
- config_name: corpus
data_files:
- split: corpus
path: corpus.jsonl
- config_name: queries
data_files:
- split: queries
path: queries.jsonl
---
From the NeuCLIR TREC Track 2023: https://arxiv.org/abs/2304.12367
Generated from https://huggingface.co/datasets/neuclir/neuclir1
```
@article{lawrie2024overview,
title={Overview of the TREC 2023 NeuCLIR Track},
author={Lawrie, Dawn and MacAvaney, Sean and Mayfield, James and McNamee, Paul and Oard, Douglas W and Soldaini, Luca and Yang, Eugene},
url={https://trec.nist.gov/pubs/trec32/papers/Overview_neuclir.pdf},
year={2024}
}
```
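Per the configs above, the default config's `test` split stores one relevance judgment per JSON line, with `query-id`, `corpus-id`, and `score` fields. A minimal sketch of grouping the relevant documents per query from such records (the sample IDs below are illustrative, not taken from the actual corpus):

```python
import json

# Each line of qrels/test.jsonl holds one relevance judgment
# (sample records are illustrative, not real corpus IDs).
sample_lines = [
    '{"query-id": "200", "corpus-id": "doc-1", "score": 3.0}',
    '{"query-id": "200", "corpus-id": "doc-2", "score": 0.0}',
]

# Group judged documents by query, keeping only relevant ones (score > 0).
relevant = {}
for line in sample_lines:
    rec = json.loads(line)
    if rec["score"] > 0:
        relevant.setdefault(rec["query-id"], []).append(rec["corpus-id"])

print(relevant)  # {'200': ['doc-1']}
```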
|
GranamyrBR/juris | ---
license: mit
---
|
autoevaluate/autoeval-eval-xsum-default-7c65dc-60294145404 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: google/pegasus-xsum
metrics: ['bertscore']
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/pegasus-xsum
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@zuzannad1](https://huggingface.co/zuzannad1) for evaluating this model. |
comet24082002/vie_wiki_dataset | ---
license: apache-2.0
task_categories:
- text-retrieval
language:
- vi
size_categories:
- 1M<n<10M
--- |
orion-research/mmlu | ---
dataset_info:
features:
- name: subject
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 15662484
num_examples: 27023
download_size: 7549815
dataset_size: 15662484
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zjkarina/matreshka | ---
dataset_info:
features:
- name: role
sequence: string
- name: dialog
sequence: string
- name: persona
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 7320311
num_examples: 6655
- name: validation
num_bytes: 1806432
num_examples: 1664
download_size: 4092810
dataset_size: 9126743
language:
- ru
pretty_name: matreshka
size_categories:
- 1K<n<10K
task_categories:
- conversational
- summarization
- text-generation
license: cc-by-4.0
---
# Dataset Card for "matreshka"

(image generated by Kandinsky-2.1 neural network)
Russian dialogues, the persona of the first interlocutor, and a summary of the dialogue generated by GPT-3.5, starting with the first phrase given in the prompt.
The matreshka dataset is a multi-task dataset: you can use it for the task of summarizing a dialogue or for generating a dialogue. It contains everyday dialogues and is also filled with facts about the world. The dataset was designed to give the interlocutor a human manner of communication.
After generation, some data contained a format that did not match the request, so we stripped the data with regular expressions. Next, we checked for the correct data type in each line and converted it to the correct format if necessary.
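The cleaning and type-checking steps described above can be sketched as follows. The exact regular expressions were not published, so the pattern and the sample record here are illustrative assumptions; the type check mirrors the features listed in this card (`role`, `dialog`, `persona`, `summary`):

```python
import re

def clean_utterance(text: str) -> str:
    # Strip speaker-label prefixes such as "Person 1:" that can leak from
    # generation (illustrative pattern; the actual regexes are not published).
    text = re.sub(r"^\s*(?:Person|Собеседник)\s*\d+\s*:\s*", "", text)
    # Collapse repeated whitespace left over after stripping.
    return re.sub(r"\s+", " ", text).strip()

def is_valid_example(example: dict) -> bool:
    # Type check mirroring the dataset schema: parallel lists of strings
    # plus string persona and summary fields.
    return (
        isinstance(example.get("role"), list)
        and isinstance(example.get("dialog"), list)
        and len(example["role"]) == len(example["dialog"])
        and all(isinstance(r, str) for r in example["role"])
        and all(isinstance(d, str) for d in example["dialog"])
        and isinstance(example.get("persona"), str)
        and isinstance(example.get("summary"), str)
    )

example = {
    "role": ["user", "assistant"],
    "dialog": [clean_utterance("Person 1:  Привет! "), "Привет!"],
    "persona": "...",
    "summary": "...",
}
print(is_valid_example(example))  # True
```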
authors' telegram channels: [@nadlskom](https://t.me/nadlskom), [@lovedeathtransformers](https://t.me/lovedeathtransformers) |
heliosprime/twitter_dataset_1713138702 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 236933
num_examples: 630
download_size: 144838
dataset_size: 236933
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713138702"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yardeny/tokenized_t5_small_context_len_256 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 14651743334
num_examples: 80462898
download_size: 5410854109
dataset_size: 14651743334
---
# Dataset Card for "tokenized_t5_small_context_len_256"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lkh9908/UncleanedCombinedHub | ---
dataset_info:
features:
- name: id
dtype: string
- name: abstract
dtype: string
- name: highlights
dtype: string
splits:
- name: train
num_bytes: 44756951
num_examples: 29932
download_size: 24130001
dataset_size: 44756951
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_DreadPoor__BagelLake-7B-slerp | ---
pretty_name: Evaluation run of DreadPoor/BagelLake-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DreadPoor/BagelLake-7B-slerp](https://huggingface.co/DreadPoor/BagelLake-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__BagelLake-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T15:59:28.200270](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__BagelLake-7B-slerp/blob/main/results_2024-02-10T15-59-28.200270.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6459222271769905,\n\
\ \"acc_stderr\": 0.032303779328089297,\n \"acc_norm\": 0.6472329006433842,\n\
\ \"acc_norm_stderr\": 0.03296936485487387,\n \"mc1\": 0.48225214198286415,\n\
\ \"mc1_stderr\": 0.017492470843075363,\n \"mc2\": 0.6375898149834941,\n\
\ \"mc2_stderr\": 0.015466648799208926\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6638225255972696,\n \"acc_stderr\": 0.013804855026205763,\n\
\ \"acc_norm\": 0.6825938566552902,\n \"acc_norm_stderr\": 0.013602239088038167\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6734714200358495,\n\
\ \"acc_stderr\": 0.004679847503411344,\n \"acc_norm\": 0.8507269468233419,\n\
\ \"acc_norm_stderr\": 0.0035562912320503525\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.041443118108781526,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.041443118108781526\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7580645161290323,\n \"acc_stderr\": 0.0243625996930311,\n \"acc_norm\"\
: 0.7580645161290323,\n \"acc_norm_stderr\": 0.0243625996930311\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n\
\ \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n\
\ \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"\
acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n\
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"\
acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507337,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507337\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389087,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389087\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.038968789850704164,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.038968789850704164\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281382,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281382\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247337,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247337\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n\
\ \"acc_stderr\": 0.016242028834053623,\n \"acc_norm\": 0.38100558659217876,\n\
\ \"acc_norm_stderr\": 0.016242028834053623\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.02536060379624256,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.02536060379624256\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537368,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537368\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n\
\ \"acc_stderr\": 0.01272570165695364,\n \"acc_norm\": 0.45827900912646674,\n\
\ \"acc_norm_stderr\": 0.01272570165695364\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031218,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031218\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696644,\n \
\ \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696644\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48225214198286415,\n\
\ \"mc1_stderr\": 0.017492470843075363,\n \"mc2\": 0.6375898149834941,\n\
\ \"mc2_stderr\": 0.015466648799208926\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273764\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5739196360879454,\n \
\ \"acc_stderr\": 0.013621144396086707\n }\n}\n```"
repo_url: https://huggingface.co/DreadPoor/BagelLake-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|arc:challenge|25_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|gsm8k|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hellaswag|10_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T15-59-28.200270.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T15-59-28.200270.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- '**/details_harness|winogrande|5_2024-02-10T15-59-28.200270.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T15-59-28.200270.parquet'
- config_name: results
data_files:
- split: 2024_02_10T15_59_28.200270
path:
- results_2024-02-10T15-59-28.200270.parquet
- split: latest
path:
- results_2024-02-10T15-59-28.200270.parquet
---
# Dataset Card for Evaluation run of DreadPoor/BagelLake-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/BagelLake-7B-slerp](https://huggingface.co/DreadPoor/BagelLake-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__BagelLake-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-10T15:59:28.200270](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__BagelLake-7B-slerp/blob/main/results_2024-02-10T15-59-28.200270.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6459222271769905,
"acc_stderr": 0.032303779328089297,
"acc_norm": 0.6472329006433842,
"acc_norm_stderr": 0.03296936485487387,
"mc1": 0.48225214198286415,
"mc1_stderr": 0.017492470843075363,
"mc2": 0.6375898149834941,
"mc2_stderr": 0.015466648799208926
},
"harness|arc:challenge|25": {
"acc": 0.6638225255972696,
"acc_stderr": 0.013804855026205763,
"acc_norm": 0.6825938566552902,
"acc_norm_stderr": 0.013602239088038167
},
"harness|hellaswag|10": {
"acc": 0.6734714200358495,
"acc_stderr": 0.004679847503411344,
"acc_norm": 0.8507269468233419,
"acc_norm_stderr": 0.0035562912320503525
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.041443118108781526,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.041443118108781526
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.0243625996930311,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.0243625996930311
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507337,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507337
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389087,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389087
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281382,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247337,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38100558659217876,
"acc_stderr": 0.016242028834053623,
"acc_norm": 0.38100558659217876,
"acc_norm_stderr": 0.016242028834053623
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.02536060379624256,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.02536060379624256
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537368,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537368
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45827900912646674,
"acc_stderr": 0.01272570165695364,
"acc_norm": 0.45827900912646674,
"acc_norm_stderr": 0.01272570165695364
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031218,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031218
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696644,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696644
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48225214198286415,
"mc1_stderr": 0.017492470843075363,
"mc2": 0.6375898149834941,
"mc2_stderr": 0.015466648799208926
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273764
},
"harness|gsm8k|5": {
"acc": 0.5739196360879454,
"acc_stderr": 0.013621144396086707
}
}
```
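The top-level `"all"` block appears to summarize the per-task metrics as an unweighted mean. A minimal stdlib sketch of that aggregation, using a small illustrative subset of the values above (not the full task list, so the result will not match the reported `"all"` accuracy):

```python
from statistics import mean

# A few per-task accuracies copied from the results above.
# This is an illustrative subset only; the real "all" value
# averages over every reported task.
task_acc = {
    "harness|arc:challenge|25": 0.6638225255972696,
    "harness|hellaswag|10": 0.6734714200358495,
    "harness|hendrycksTest-abstract_algebra|5": 0.32,
    "harness|winogrande|5": 0.8366219415943172,
}

# Unweighted mean across tasks.
overall = mean(task_acc.values())
print(f"mean acc over {len(task_acc)} tasks: {overall:.4f}")
```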
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tyzhu/squad_v2_1000_0.90_id | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: question
dtype: string
- name: context
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 122370541.34995362
num_examples: 70448
- name: validation
num_bytes: 1920159
num_examples: 1000
download_size: 5249130
dataset_size: 124290700.34995362
---
# Dataset Card for "squad_v2_1000_0.90_id"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NimaBoscarino/fuego-20230224-003323-ca1442 | ---
tags:
- fuego
fuego:
id: 20230224-003323-ca1442
status: preparing
script: train.py
requirements_file: requirements.txt
space_id: NimaBoscarino/fuego-20230224-003323-ca1442
space_hardware: cpu-basic
---
|
tj-solergibert/t5-Europarl-en | ---
dataset_info:
features:
- name: dest_lang
dtype:
class_label:
names:
'0': de
'1': en
'2': es
'3': fr
'4': it
'5': nl
'6': pl
'7': pt
'8': ro
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 417227830
num_examples: 561067
- name: test
num_bytes: 61238209
num_examples: 80606
- name: valid
num_bytes: 57930051
num_examples: 76911
download_size: 125777513
dataset_size: 536396090
---
# Dataset Card for "t5-Europarl-en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Carlosgg14/gokublack | ---
license: openrail
---
|
oscarlaird/introductions | ---
dataset_info:
features:
- name: id
dtype: string
- name: dialogue
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 1686.0
num_examples: 10
- name: validation
num_bytes: 843.0
num_examples: 5
- name: test
num_bytes: 843.0
num_examples: 5
download_size: 9238
dataset_size: 3372.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
unigram/fol-03 | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: string
- name: proof
dtype: string
- name: premise_tptp
dtype: string
- name: hypothesis_tptp
dtype: string
- name: deberta_pred
dtype: string
- name: deberta_pred_r1_label
dtype: string
- name: deberta_pred_r2_label
dtype: string
- name: deberta_pred_r3_label
dtype: string
splits:
- name: train
num_bytes: 42411290
num_examples: 4791
- name: validation
num_bytes: 5879894
num_examples: 672
- name: test
num_bytes: 5093895
num_examples: 636
download_size: 9159694
dataset_size: 53385079
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/4e48e34a | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1336
dataset_size: 188
---
# Dataset Card for "4e48e34a"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
galthran/banks_sentenized | ---
dataset_info:
features:
- name: text
dtype: string
- name: triplets
sequence:
sequence: string
- name: triplets_str
dtype: string
splits:
- name: train
num_bytes: 5006701
num_examples: 22551
- name: val
num_bytes: 1146054
num_examples: 5229
- name: test
num_bytes: 1094646
num_examples: 4996
download_size: 3538132
dataset_size: 7247401
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
maykcaldas/smiles-transformers | ---
license: mit
language:
- en
pretty_name: smiles-transformer-dataset
size_categories:
- 100M<n<1B
dataset_info:
features:
- name: text
dtype: string
- name: formula
dtype: string
- name: NumHDonors
dtype: int64
- name: NumHAcceptors
dtype: int64
- name: MolLogP
dtype: float64
- name: NumHeteroatoms
dtype: int64
- name: RingCount
dtype: int64
- name: NumRotatableBonds
dtype: int64
- name: NumAromaticBonds
dtype: int64
- name: NumAcidGroups
dtype: int64
- name: NumBasicGroups
dtype: int64
- name: Apol
dtype: float64
splits:
- name: train
num_bytes: 136431671689
num_examples: 908086717
- name: test
num_bytes: 7437928022
num_examples: 50487919
- name: validation
num_bytes: 7621324737
num_examples: 50605067
download_size: 34998665406
dataset_size: 151490924448
---
# smiles-transformers dataset
TODO: Add references to the datasets we curated
## dataset features
- name: text
- Molecule SMILES : string
- name: formula
- Molecular formula : string
- name: NumHDonors
- Number of hydrogen bond donors : int
- name: NumHAcceptors
- Number of hydrogen bond acceptors : int
- name: MolLogP
- Wildman-Crippen LogP : float
- name: NumHeteroatoms
- Number of heteroatoms : int
- name: RingCount
- Number of rings : int
- name: NumRotatableBonds
- Number of rotatable bonds : int
- name: NumAromaticBonds
- Number of aromatic bonds : int
- name: NumAcidGroups
- Number of acid groups : int
- name: NumBasicGroups
- Number of basic groups : int
- name: Apol
- Sum of atomic polarizabilities : float
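To make the `NumHeteroatoms` column concrete, here is a deliberately naive, stdlib-only sketch that counts non-carbon heavy atoms in a simple (bracket-free) SMILES string. This is only an illustration of what the column represents; the dataset values were presumably computed with a real cheminformatics toolkit such as RDKit, which also handles bracket atoms, charges, and isotopes correctly:

```python
def naive_num_heteroatoms(smiles: str) -> int:
    """Count heteroatoms (non-carbon heavy atoms) in a bracket-free SMILES.

    Handles the two-letter organic-subset halogens (Cl, Br) before
    falling back to single-letter symbols; lowercase letters cover
    aromatic atoms (n, o, s, ...). Carbon (C/c) and hydrogen are skipped.
    """
    hetero = 0
    i = 0
    while i < len(smiles):
        if smiles[i:i + 2] in ("Cl", "Br"):
            hetero += 1
            i += 2
        elif smiles[i] in "BNOSPFIbnosp":
            hetero += 1
            i += 1
        else:
            # carbon, bonds, digits, branch parentheses, etc.
            i += 1
    return hetero

print(naive_num_heteroatoms("CCO"))                       # ethanol -> 1
print(naive_num_heteroatoms("CC(=O)OC1=CC=CC=C1C(=O)O"))  # aspirin -> 4
```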
## citation information |
CyberHarem/ltlx7000_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ltlx7000/LTLX7000/LTLX7000 (Girls' Frontline)
This is the dataset of ltlx7000/LTLX7000/LTLX7000 (Girls' Frontline), containing 30 images and their tags.
The core tags of this character are `long_hair, breasts, red_eyes, bangs, mole, large_breasts, ahoge, mole_on_breast, braid, hair_between_eyes, green_hair, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 30 | 46.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ltlx7000_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 30 | 24.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ltlx7000_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 75 | 52.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ltlx7000_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 30 | 40.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ltlx7000_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 75 | 76.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ltlx7000_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ltlx7000_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, closed_mouth, blush, looking_at_viewer, simple_background, solo, bare_shoulders, cleavage, hair_ribbon, white_background, hair_ornament, one-piece_swimsuit, aqua_hair, ass, collarbone, earrings, grey_hair, open_jacket, thighs, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | blush | looking_at_viewer | simple_background | solo | bare_shoulders | cleavage | hair_ribbon | white_background | hair_ornament | one-piece_swimsuit | aqua_hair | ass | collarbone | earrings | grey_hair | open_jacket | thighs | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------|:--------------------|:--------------------|:-------|:-----------------|:-----------|:--------------|:-------------------|:----------------|:---------------------|:------------|:------|:-------------|:-----------|:------------|:--------------|:---------|:-------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Tonyhacker/Don_Cannelloni_Dataset | ---
license: openrail
---
|
Pfel/Energie1 | ---
license: llama2
---
|
germank/hh-rlhf_with_features | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: helpfulness_chosen
dtype: int64
- name: helpfulness_rejected
dtype: int64
- name: specificity_chosen
dtype: int64
- name: specificity_rejected
dtype: int64
- name: intent_chosen
dtype: int64
- name: intent_rejected
dtype: int64
- name: factuality_chosen
dtype: int64
- name: factuality_rejected
dtype: int64
- name: easy-to-understand_chosen
dtype: int64
- name: easy-to-understand_rejected
dtype: int64
- name: relevance_chosen
dtype: int64
- name: relevance_rejected
dtype: int64
- name: readability_chosen
dtype: int64
- name: readability_rejected
dtype: int64
- name: enough-detail_chosen
dtype: int64
- name: enough-detail_rejected
dtype: int64
- name: biased:_chosen
dtype: int64
- name: biased:_rejected
dtype: int64
- name: fail-to-consider-individual-preferences_chosen
dtype: int64
- name: fail-to-consider-individual-preferences_rejected
dtype: int64
- name: repetetive_chosen
dtype: int64
- name: repetetive_rejected
dtype: int64
- name: fail-to-consider-context_chosen
dtype: int64
- name: fail-to-consider-context_rejected
dtype: int64
- name: too-long_chosen
dtype: int64
- name: too-long_rejected
dtype: int64
- name: human
dtype: string
- name: assistant_chosen
dtype: string
- name: assistant_rejected
dtype: string
- name: labels
dtype: string
splits:
- name: train
num_bytes: 14281240
num_examples: 9574
- name: test
num_bytes: 14225165
num_examples: 9574
download_size: 15456243
dataset_size: 28506405
---
# Dataset Card for "hh-rlhf_with_features"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hoangvanvietanh/user_da91d399b47141ccaa812c8b16e8c380_dataset | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: id
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 7498465.0
num_examples: 13
download_size: 7501642
dataset_size: 7498465.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zhangshuoming/aligned_c_x86_O0_exebench_10k_json_cleaned | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 16767029.979
num_examples: 7833
download_size: 2124503
dataset_size: 16767029.979
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "aligned_c_x86_O0_exebench_10k_json_cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/cvasnlp_sample_test_augmented | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': abyssinian
'1': american bulldog
'2': american pit bull terrier
'3': basset hound
'4': beagle
'5': bengal
'6': birman
'7': bombay
'8': boxer
'9': british shorthair
'10': chihuahua
'11': egyptian mau
'12': english cocker spaniel
'13': english setter
'14': german shorthaired
'15': great pyrenees
'16': havanese
'17': japanese chin
'18': keeshond
'19': leonberger
'20': maine coon
'21': miniature pinscher
'22': newfoundland
'23': persian
'24': pomeranian
'25': pug
'26': ragdoll
'27': russian blue
'28': saint bernard
'29': samoyed
'30': scottish terrier
'31': shiba inu
'32': siamese
'33': sphynx
'34': staffordshire bull terrier
'35': wheaten terrier
'36': yorkshire terrier
- name: id
dtype: int64
- name: tags
sequence: string
- name: attributes
sequence: string
- name: caption
dtype: string
- name: intensive_captions
sequence: string
splits:
- name: test
num_bytes: 183727.0
num_examples: 10
download_size: 162139
dataset_size: 183727.0
---
# Dataset Card for "cvasnlp_sample_test_augmented"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nicolas-BZRD/uld_loss_Mistral-7B-Instruct-v0.2-qed | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
list:
- name: end
dtype: int64
- name: start
dtype: int64
- name: string
dtype: string
- name: answers_generated
dtype: string
splits:
- name: train
num_bytes: 5356277
num_examples: 7009
- name: validation
num_bytes: 467101
num_examples: 610
download_size: 3844501
dataset_size: 5823378
---
# Dataset Card for "uld_loss_Mistral-7B-Instruct-v0.2-qed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jkwiatkowski/raven_properties | ---
dataset_info:
features:
- name: Description
dtype: string
splits:
- name: train
num_bytes: 7234653
num_examples: 42000
- name: val
num_bytes: 2410755
num_examples: 14000
- name: test
num_bytes: 2412471
num_examples: 14000
download_size: 997897
dataset_size: 12057879
---
# Dataset Card for "raven_properties"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/mmarco_zh_dev_v1.1 | ---
pretty_name: '`mmarco/zh/dev/v1.1`'
viewer: false
source_datasets: ['irds/mmarco_zh', 'irds/mmarco_zh_dev']
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/zh/dev/v1.1`
The `mmarco/zh/dev/v1.1` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/zh/dev/v1.1).
# Data
This dataset provides:
- `queries` (i.e., topics); count=101,093
- For `docs`, use [`irds/mmarco_zh`](https://huggingface.co/datasets/irds/mmarco_zh)
- For `qrels`, use [`irds/mmarco_zh_dev`](https://huggingface.co/datasets/irds/mmarco_zh_dev)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mmarco_zh_dev_v1.1', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
lectura/dagit2-fake-detection | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: prompt_name
dtype: string
- name: source
dtype: string
- name: RDizzl3_seven
dtype: bool
- name: __index_level_0__
dtype: int64
splits:
- name: small_train
num_bytes: 6733440
num_examples: 3000
- name: train
num_bytes: 100255071
num_examples: 43868
- name: test
num_bytes: 2256359
num_examples: 1000
download_size: 57330696
dataset_size: 109244870
configs:
- config_name: default
data_files:
- split: small_train
path: data/small_train-*
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Datasets
* small_train: 3000 rows (0: 1500, 1: 1500) (subset of train)
* train: 43868 rows (0: 26871, 1: 16997)
* test: 1000 rows (0: 500, 1: 500)
# Labels
* 0: human
* 1: ai |