datasetId | card |
|---|---|
betterMateusz/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
RAHULssastae/wikisqlqanda | ---
dataset_info:
features:
- name: id
dtype: int64
- name: title
dtype: string
- name: question
dtype: string
- name: context
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 9802871
num_examples: 56355
- name: validation
num_bytes: 2755625
num_examples: 15878
download_size: 3795526
dataset_size: 12558496
---
# Dataset Card for "wikisqlQandA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kernelmachine/open-license-corpus | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
pretty_name: pubtext
size_categories:
- 100B<n<1T
---
# PubText
Welcome to the Open License Corpus (OLC), a 228B token corpus for training permissively-licensed language models.
**Disclaimer**: OLC should not be considered a universally safe-to-use dataset. We encourage users of OLC to consult a legal professional on the suitability of each data source for their application.
## Dataset Description
- **Repository:** [Silo LM repository](https://github.com/kernelmachine/silo-lm)
- **Paper:** [Silo LM paper](https://github.com/kernelmachine/silo-lm)
- **Point of Contact:** [Suchin Gururangan](mailto:sg01@cs.washington.edu)
### Dataset Summary
| Domain | Sources | Specific License | # BPE Tokens (in billions; GPT-NeoX tokenizer) |
|--------------|------------------------------------------------------|------------------|------------------|
| Legal | Case Law, Pile of Law (PD subset) | Public Domain | 27.1 |
| Legal | Pile of Law (CC BY-SA subset) | CC BY-SA | 0.07 |
| Code | Github (permissive) | MIT/BSD/Apache | 58.9 |
| Conversational| HackerNews, Ubuntu IRC | MIT/Apache | 5.9 |
| Conversational | Stack Overflow, Stack Exchange | CC BY-SA | 21.3 |
| Math | Deepmind Math, AMPS | Apache | 3.5 |
| Science | ArXiv abstracts, S2ORC (PD subset) | Public Domain | 1.2 |
| Science | S2ORC (CC BY-SA subset) | CC BY-SA | 70.3 |
| Books | Gutenberg | Public Domain | 2.9 |
| News | Public domain news | Public Domain | 0.2 |
| News | Wikinews | CC BY-SA | 0.01 |
| Encyclopedic | Wikipedia | CC BY-SA | 37.0 |
### Supported Tasks and Leaderboards
- `text-generation`: The dataset can be used to train a language model for text generation. The language model performance is evaluated based on perplexity.
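Perplexity is the exponential of the negative mean token log-likelihood; a minimal sketch (pure Python, with hypothetical per-token log-probabilities):

```python
import math

def perplexity(token_logprobs):
    """Perplexity from per-token natural-log probabilities:
    exp of the negative mean log-likelihood."""
    assert token_logprobs, "need at least one token"
    return math.exp(-sum(token_logprobs) / len(token_logprobs))

# A model that assigns every token probability 0.5 has perplexity 2.
print(perplexity([math.log(0.5)] * 4))  # ≈ 2.0
```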
### Languages
OLC is primarily an English-language dataset, but also contains some data in other languages (primarily in the Wikipedia subset, which draws on the [Red Pajama](https://github.com/togethercomputer/RedPajama-Data) data collection).
## Dataset Structure
The dataset has a standard text-only structure, separated into the subsets that we include in the paper. Each subset can be streamed by its config name:
```python
from datasets import load_dataset
dataset = load_dataset('kernelmachine/open-license-corpus', 'pd_law', streaming=True)['train']
```
To use a collection of sources, you should specify each individually and interleave, like so:
```python
from datasets import interleave_datasets, load_dataset
d1 = load_dataset('kernelmachine/open-license-corpus', 'pd_law', streaming=True)['train']
d2 = load_dataset('kernelmachine/open-license-corpus', 'sw_github', streaming=True)['train']
d1_d2 = interleave_datasets([d1,d2], probabilities=[0.8, 0.2], seed=42)
```
### Data Instances and Fields
The dataset has a standard text-only structure, e.g. `{"text": "this is a document"}`. We do not add any other fields to documents.
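Because each record carries only a `text` field, the schema is easy to materialize locally; a minimal sketch of writing and re-reading documents in this shape as JSON Lines (the file name is illustrative, not part of the corpus):

```python
import json
import os
import tempfile

# Documents in OLC's schema: a single "text" field, nothing else.
docs = [{"text": "this is a document"}, {"text": "another document"}]

# Write and re-read as JSON Lines.
path = os.path.join(tempfile.mkdtemp(), "pd_law_sample.jsonl")
with open(path, "w") as f:
    for doc in docs:
        f.write(json.dumps(doc) + "\n")

with open(path) as f:
    loaded = [json.loads(line) for line in f]

assert all(set(d) == {"text"} for d in loaded)
print(loaded[0]["text"])  # → this is a document
```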
### Data Splits
We only include the training data in this repository.
For validation data, in the paper we use the Pile validation data, which we decontaminate OLC against using a deduplication script (see more below).
The Pile validation data that we use in the paper can be found [here]().
## Dataset Creation
### License Taxonomy
* **Public Domain (PD):** Public domain text has no restrictions.
* **Permissively licensed software (SW):** including MIT, Apache, and BSD software.
* **Attribution licenses (BY):** such as Creative Commons Attribution (CC-BY) are free to use as long as "credit is given to the creator."
* **All other data:** that is not in one of the above three categories is assumed to be non-permissive. This includes: any text that is explicitly protected by copyright or licenses that are non-commercial (e.g., CC-NC), any software without clear MIT, BSD, or Apache licenses, and any generic web-crawled data where the license or copyright information may be unclear.
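The taxonomy above amounts to a mapping from license metadata to one of four buckets; a toy sketch of that mapping (the identifier strings below are illustrative, as real metadata varies by source):

```python
# Illustrative license-identifier sets for OLC's taxonomy.
PD_LICENSES = {"public-domain", "cc0-1.0"}
SW_LICENSES = {"mit", "bsd-2-clause", "bsd-3-clause", "apache-2.0"}
BY_LICENSES = {"cc-by-4.0", "cc-by-sa-4.0"}

def classify(license_id):
    """Map a license identifier to PD / SW / BY / non-permissive."""
    lid = license_id.strip().lower()
    if lid in PD_LICENSES:
        return "PD"
    if lid in SW_LICENSES:
        return "SW"
    if lid in BY_LICENSES:
        return "BY"
    # Everything else (CC-NC, explicit copyright, unclear web-crawl
    # provenance, ...) is treated as non-permissive and excluded.
    return "non-permissive"

print(classify("Apache-2.0"))    # → SW
print(classify("cc-by-nc-4.0"))  # → non-permissive
```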
### Building OLC
Based on this taxonomy of licenses, we build OLC, a 228B token corpus of PD, SW, and BY data. OLC consists of 17 manually-selected sources of
primarily English text that are under permissive licenses.
The text generally falls into eight different domains:
* **Legal:** We curate legal text from the Pile of Law, an amalgamation of 31 different sources of text related to civil court cases, patents, and other legal and governmental works, either licensed as public domain or CC-BY. We also gather public domain text from the Case Law Access Project, which covers over 6.5 million decisions published by state and federal courts throughout U.S. history.
* **Code:** We use the Github subset of the RedPajama dataset, which contains code from Github repositories with three permissive software licenses: MIT, Apache, and BSD.
* **Conversation:** We source conversational text under permissive software licenses from the HackerNews (MIT license) and the Ubuntu IRC (Apache license) subsets of the Pile. We also use the Stackexchange subset of the RedPajama dataset and a Stackoverflow corpus from Kaggle, both under the CC-BY-SA license.
* **Math:** We source mathematical text from the Deepmind Mathematics and the AMPS datasets, both of which are under the Apache license.
* **Science:** We source scientific text from ArXiv abstracts that are in the public domain. We also collect full-text articles from the Semantic Scholar Research Corpus (S2ORC), either licensed as public domain or CC-BY.
* **Books:** We source books from the Gutenberg corpus, which are copyright-expired books that are in the public domain.
* **News:** We collect public domain news text from the English subset of the MOT corpus. We also collect text from Wikinews, which is under CC BY-SA.
* **Encyclopedic:** Finally, we include a large set of Wikipedia articles from the subset included in RedPajama. We follow RedPajama in using Wikipedia snapshots from 20 languages even though the model primarily focuses on English.
#### Initial Data Collection and Normalization
We deduplicate text using a document-level filter that considers $n$-gram overlap. We first deduplicate within each domain to remove redundant documents from similar sources (e.g. Case Law and the Pile of Law), and then perform deduplication against the validation and test datasets of the Pile to avoid test leakage.
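The idea of document-level $n$-gram overlap filtering can be sketched as below; the threshold, $n$, and the quadratic pairwise loop are illustrative only (the paper's script is more scalable), not the exact settings used:

```python
def ngrams(text, n=3):
    """Set of word n-grams in a document."""
    tokens = text.split()
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def overlap(doc_a, doc_b, n=3):
    """Jaccard similarity over word n-grams."""
    a, b = ngrams(doc_a, n), ngrams(doc_b, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def deduplicate(docs, threshold=0.8, n=3):
    """Keep a document only if it overlaps no already-kept document."""
    kept = []
    for doc in docs:
        if all(overlap(doc, other, n) < threshold for other in kept):
            kept.append(doc)
    return kept

docs = ["the quick brown fox jumps over the lazy dog",
        "the quick brown fox jumps over the lazy dog",  # exact duplicate
        "a completely different legal opinion about patents"]
print(len(deduplicate(docs)))  # → 2
```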
We do not perform any additional quality filtering, though some subsets (e.g. Github and Wikipedia) are already quality filtered by the original data curators of those subsets.
#### Who are the source language producers?
The source language producers vary by domain; the Legal subset primarily contains governmental documents, while the Github subset contains code repositories written by the public. We refer to each data source for further information.
### Annotations
The dataset does not contain any additional annotations.
#### Annotation process
[N/A]
#### Who are the annotators?
[N/A]
### Personal and Sensitive Information
We do not perform additional filtering to remove personally identifiable information, so it is possible that certain subsets still pose privacy risks despite being permissively licensed.
## Considerations for Using the Data
Please see the disclaimer above. The license associated with a document may be time- and country-dependent. Moreover, other legal constraints may prohibit the use of a data source despite a permissive data license. We encourage users of OLC to consult a legal professional on the suitability of each data source for their application.
### Social Impact of Dataset
OLC is the first multidomain, permissively licensed corpus, which can enable language models that align better to data-use regulations such as the fair-use doctrine in the United States and the GDPR in the European Union.
### Discussion of Biases and Limitations
While OLC mitigates copyright and privacy risks, it may exacerbate certain fairness issues, like toxicity towards marginalized groups and racial biases, especially due to the prevalence of older copyright-expired books in the training data.
In addition, OLC relies on explicit metadata to identify licenses, which may lead to underestimates of the amount and diversity of permissively licensed text actually available on the web.
### Dataset Curators
OLC was curated by the authors of SILO language models.
### Licensing Information
We release this corpus under the Apache 2.0 license.
### Citation Information
|
open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-Pretrain-v1 | ---
pretty_name: Evaluation run of IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1](https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-Pretrain-v1\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T16:26:56.383238](https://huggingface.co/datasets/open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-Pretrain-v1/blob/main/results_2023-12-02T16-26-56.383238.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"\
acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \
\ \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|arc:challenge|25_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T07_31_46.021134
path:
- '**/details_harness|drop|3_2023-10-13T07-31-46.021134.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-13T07-31-46.021134.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T07_31_46.021134
path:
- '**/details_harness|gsm8k|5_2023-10-13T07-31-46.021134.parquet'
- split: 2023_12_02T16_26_56.383238
path:
- '**/details_harness|gsm8k|5_2023-12-02T16-26-56.383238.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T16-26-56.383238.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hellaswag|10_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T14:27:13.663491.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T14:27:13.663491.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T14:27:13.663491.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T07_31_46.021134
path:
- '**/details_harness|winogrande|5_2023-10-13T07-31-46.021134.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-13T07-31-46.021134.parquet'
- config_name: results
data_files:
- split: 2023_07_18T14_27_13.663491
path:
- results_2023-07-18T14:27:13.663491.parquet
- split: 2023_10_13T07_31_46.021134
path:
- results_2023-10-13T07-31-46.021134.parquet
- split: 2023_12_02T16_26_56.383238
path:
- results_2023-12-02T16-26-56.383238.parquet
- split: latest
path:
- results_2023-12-02T16-26-56.383238.parquet
---
# Dataset Card for Evaluation run of IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1](https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-Pretrain-v1",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-02T16:26:56.383238](https://huggingface.co/datasets/open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-Pretrain-v1/blob/main/results_2023-12-02T16-26-56.383238.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
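A results dictionary of this shape can be flattened into per-task metric rows with a few lines of plain Python; a minimal sketch (the helper name is ours, not part of the leaderboard tooling):

```python
# Structure mirrors the "Latest results" JSON above.
results = {
    "all": {"acc": 0.0, "acc_stderr": 0.0},
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
}

def flatten_metrics(results):
    """Yield (task, metric, value) triples, skipping the "all" aggregate."""
    for task, metrics in results.items():
        if task == "all":
            continue
        for metric, value in metrics.items():
            yield task, metric, value

rows = list(flatten_metrics(results))
# rows == [("harness|gsm8k|5", "acc", 0.0), ("harness|gsm8k|5", "acc_stderr", 0.0)]
```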
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dms2ect/wikipedia_character_abstracts | ---
license: apache-2.0
---
|
MaxReynolds/cifar10_v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': airplane
'1': automobile
'2': bird
'3': cat
'4': deer
'5': dog
'6': frog
'7': horse
'8': ship
'9': truck
splits:
- name: train
num_bytes: 113648310.0
num_examples: 50000
download_size: 119709270
dataset_size: 113648310.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cifar10_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DynamicSuperbPrivate/NoiseDetectionMusic_VoxcelebMusan | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 3088864834.0
num_examples: 24000
- name: validation
num_bytes: 671571520.0
num_examples: 5218
- name: test
num_bytes: 1254594428.0
num_examples: 9748
download_size: 5004030900
dataset_size: 5015030782.0
---
# Dataset Card for "NoiseDetectionMusic_VoxcelebMusan"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/nfcorpus_train_video | ---
pretty_name: '`nfcorpus/train/video`'
viewer: false
source_datasets: ['irds/nfcorpus']
task_categories:
- text-retrieval
---
# Dataset Card for `nfcorpus/train/video`
The `nfcorpus/train/video` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/nfcorpus#nfcorpus/train/video).
# Data
This dataset provides:
- `queries` (i.e., topics); count=812
- `qrels`: (relevance assessments); count=27,465
- For `docs`, use [`irds/nfcorpus`](https://huggingface.co/datasets/irds/nfcorpus)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/nfcorpus_train_video', 'queries')
for record in queries:
record # {'query_id': ..., 'title': ..., 'desc': ...}
qrels = load_dataset('irds/nfcorpus_train_video', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ...}
```
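For scoring, the flat qrels records above are often easier to use as a per-query lookup; a minimal sketch over records of the shape shown (the sample ids are illustrative, not taken from the dataset):

```python
from collections import defaultdict

def build_qrels_index(records):
    """Group flat qrels records into {query_id: {doc_id: relevance}}."""
    index = defaultdict(dict)
    for r in records:
        index[r["query_id"]][r["doc_id"]] = r["relevance"]
    return dict(index)

sample = [
    {"query_id": "q1", "doc_id": "d10", "relevance": 2},
    {"query_id": "q1", "doc_id": "d14", "relevance": 1},
    {"query_id": "q2", "doc_id": "d10", "relevance": 1},
]
index = build_qrels_index(sample)
# index["q1"] == {"d10": 2, "d14": 1}
```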
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Boteva2016Nfcorpus,
title="A Full-Text Learning to Rank Dataset for Medical Information Retrieval",
author = "Vera Boteva and Demian Gholipour and Artem Sokolov and Stefan Riezler",
booktitle = "Proceedings of the European Conference on Information Retrieval ({ECIR})",
location = "Padova, Italy",
publisher = "Springer",
year = 2016
}
```
|
ccmusic-database/chest_falsetto | ---
license: mit
task_categories:
- audio-classification
language:
- zh
- en
tags:
- music
- art
pretty_name: Chest voice and Falsetto Dataset
size_categories:
- 1K<n<10K
viewer: false
---
# Dataset Card for Chest voice and Falsetto Dataset
The raw dataset comprises 1,280 monophonic singing audio files in .wav format (sample rate 22,050 Hz), consisting of chest and falsetto voices performed, recorded, and annotated by students majoring in Vocal Music at the China Conservatory of Music. The chest voice is tagged as chest and the falsetto voice as falsetto. Additionally, the dataset includes the Mel spectrogram, Mel-frequency cepstral coefficients (MFCC), and spectral characteristics of each audio segment, resulting in a total of 5,120 CSV files. The original dataset did not differentiate between male and female voices, an omission that is critical for accurately identifying chest and falsetto vocal techniques. To address this, we conducted a meticulous manual review and added gender annotations to the dataset. Besides the original content, a preprocessed version used during the evaluation (detailed in Section IV) is also provided. This two-version approach is also applied to the two subsequent classification datasets that have not yet been evaluated: the Music Genre Dataset and the Bel Canto & Chinese Folk Singing Dataset.
### Eval Subset
```python
from datasets import load_dataset
ds = load_dataset("ccmusic-database/chest_falsetto", name="eval")
for item in ds["train"]:
print(item)
for item in ds["validation"]:
print(item)
for item in ds["test"]:
print(item)
```
### Raw Subset
```python
from datasets import load_dataset
ds = load_dataset("ccmusic-database/chest_falsetto", name="default")
for item in ds["train"]:
print(item)
for item in ds["validation"]:
print(item)
for item in ds["test"]:
print(item)
```
## Maintenance
```bash
GIT_LFS_SKIP_SMUDGE=1 git clone git@hf.co:datasets/ccmusic-database/chest_falsetto
cd chest_falsetto
```
## Dataset Description
- **Homepage:** <https://ccmusic-database.github.io>
- **Repository:** <https://huggingface.co/datasets/ccmusic-database/chest_falsetto>
- **Paper:** <https://doi.org/10.5281/zenodo.5676893>
- **Leaderboard:** <https://ccmusic-database.github.io/team.html>
- **Point of Contact:** <https://www.modelscope.cn/datasets/ccmusic/chest_falsetto>
### Dataset Summary
For the preprocessed version, each audio clip was cut into 0.25-second segments and then transformed into Mel, CQT, and Chroma spectrograms in .jpg format, resulting in 8,974 files. The chest/falsetto label for each file is one of four classes: m_chest, m_falsetto, f_chest, and f_falsetto. The spectrograms, the chest/falsetto label, and the gender label are combined into one data entry, with the first three columns representing the Mel, CQT, and Chroma spectrograms; the fourth and fifth columns are the chest/falsetto label and the gender label, respectively. Additionally, the integrated dataset provides a function to shuffle and split the dataset into training, validation, and test sets in an 8:1:1 ratio. This dataset can be used for singing-related tasks such as singing gender classification or chest and falsetto voice classification.
### Supported Tasks and Leaderboards
Audio classification, singing method classification, voice classification
### Languages
Chinese, English
## Dataset Structure
<style>
.datastructure td {
vertical-align: middle !important;
text-align: center;
}
.datastructure th {
text-align: center;
}
</style>
### Eval Subset
<table class="datastructure">
<tr>
<th>mel(.jpg, 48000Hz)</th>
<th>cqt(.jpg, 48000Hz)</th>
<th>chroma(.jpg, 48000Hz)</th>
<th>label</th>
<th>gender</th>
<th>singing_method</th>
</tr>
<tr>
<td><img src="./data/W8wy7pkYZtCt3lI5Oq39l.jpeg"></td>
<td><img src="./data/48qPVDDIZe0ttsYXrTJEh.jpeg"></td>
<td><img src="./data/zm0KorKYtmvOje8qmivHJ.jpeg"></td>
<td>m_chest, m_falsetto, f_chest, f_falsetto</td>
<td>male, female</td>
<td>chest, falsetto</td>
</tr>
<tr>
<td>...</td>
<td>...</td>
<td>...</td>
<td>...</td>
<td>...</td>
<td>...</td>
</tr>
</table>
### Raw Subset
<table class="datastructure">
<tr>
<th>audio(.wav, 22050Hz)</th>
<th>mel(spectrogram, .jpg, 22050Hz)</th>
<th>label(4-class)</th>
<th>gender(2-class)</th>
<th>singing_method(2-class)</th>
</tr>
<tr>
<td><audio controls src="https://cdn-uploads.huggingface.co/production/uploads/655e0a5b8c2d4379a71882a9/LKSBb11kCyPl15b-DJo6V.wav"></audio></td>
<td><img src="./data/0001_m_chest.jpg"></td>
<td>m_chest, m_falsetto, f_chest, f_falsetto</td>
<td>male, female</td>
<td>chest, falsetto</td>
</tr>
<tr>
<td>...</td>
<td>...</td>
<td>...</td>
<td>...</td>
<td>...</td>
</tr>
</table>
### Data Instances
.zip(.wav, .jpg)
### Data Fields
m_chest, f_chest, m_falsetto, f_falsetto
### Data Splits
| Split | Eval | Raw |
| :-------------: | :---: | :---: |
| total | 8974 | 1280 |
| train(80%) | 7179 | 1024 |
| validation(10%) | 897 | 128 |
| test(10%) | 898 | 128 |
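The split counts in the table above are consistent with a plain shuffled 8:1:1 index split in which the remainder goes to the test set; a minimal pure-Python sketch — the seed and the exact shuffling used upstream are assumptions:

```python
import random

def split_811(num_examples, seed=42):
    """Shuffle example indices and split them 80/10/10; remainder goes to test."""
    idx = list(range(num_examples))
    random.Random(seed).shuffle(idx)
    n_train = int(num_examples * 0.8)
    n_val = int(num_examples * 0.1)
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]

# Raw subset: 1280 files -> 1024 / 128 / 128, matching the table.
train, val, test = split_811(1280)
```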
## Dataset Creation
### Curation Rationale
Lack of a dataset for Chest voice and Falsetto
### Source Data
#### Initial Data Collection and Normalization
Zhaorui Liu, Monan Zhou
#### Who are the source language producers?
Students from CCMUSIC
### Annotations
#### Annotation process
1,280 monophonic singing audio recordings (.wav format) of chest and falsetto voices, with the chest voice tagged as _chest_ and the falsetto voice tagged as _falsetto_.
#### Who are the annotators?
Students from CCMUSIC
### Personal and Sensitive Information
None
## Considerations for Using the Data
### Social Impact of Dataset
Promoting the development of AI in the music industry
### Discussion of Biases
Only for chest and falsetto voices
### Other Known Limitations
The recordings are cut into slices that are too short;
the CQT spectrogram column suffers from spectral leakage, and because each audio slice is only 0.5 s long, this problem cannot be effectively avoided.
## Additional Information
### Dataset Curators
Zijin Li
### Evaluation
<https://huggingface.co/ccmusic-database/chest_falsetto>
### Licensing Information
```
MIT License
Copyright (c) CCMUSIC
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
```
### Citation Information
```bibtex
@dataset{zhaorui_liu_2021_5676893,
author = {Monan Zhou, Shenyang Xu, Zhaorui Liu, Zhaowen Wang, Feng Yu, Wei Li and Baoqiang Han},
title = {CCMusic: an Open and Diverse Database for Chinese and General Music Information Retrieval Research},
month = {mar},
year = {2024},
publisher = {HuggingFace},
version = {1.2},
url = {https://huggingface.co/ccmusic-database}
}
```
### Contributions
Provide a dataset for distinguishing chest and falsetto voices |
autoevaluate/autoeval-eval-samsum-samsum-61187c-1532155205 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: SamuelAllen123/t5-efficient-large-nl36_fine_tune_sum_V2
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: train
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: SamuelAllen123/t5-efficient-large-nl36_fine_tune_sum_V2
* Dataset: samsum
* Config: samsum
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@samuelallen123](https://huggingface.co/samuelallen123) for evaluating this model. |
Denliner/LoRA | ---
license: openrail
---
|
CyberHarem/gr_mg23_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of gr_mg23/GrMG23/HK23 (Girls' Frontline)
This is the dataset of gr_mg23/GrMG23/HK23 (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are `breasts, double_bun, hair_bun, long_hair, blonde_hair, large_breasts, purple_eyes, bangs, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 14.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg23_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 7.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg23_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 30 | 17.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg23_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 12.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg23_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 30 | 26.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg23_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gr_mg23_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, solo, blush, gloves, looking_at_viewer, long_sleeves, open_mouth, white_background, black_skirt, black_thighhighs, pleated_skirt, black_jacket, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | gloves | looking_at_viewer | long_sleeves | open_mouth | white_background | black_skirt | black_thighhighs | pleated_skirt | black_jacket | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:---------|:--------------------|:---------------|:-------------|:-------------------|:--------------|:-------------------|:----------------|:---------------|:--------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
distil-whisper/librispeech_asr_dummy-concatenated | ---
dataset_info:
config_name: clean
features:
- name: text
dtype: string
- name: id
dtype: string
- name: input_features
sequence:
sequence: float32
splits:
- name: validation
num_bytes: 26127227
num_examples: 17
download_size: 21173882
dataset_size: 26127227
configs:
- config_name: clean
data_files:
- split: validation
path: clean/validation-*
---
|
liuyanchen1015/MULTI_VALUE_mrpc_zero_plural | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 296532
num_examples: 1082
- name: train
num_bytes: 645432
num_examples: 2353
- name: validation
num_bytes: 70660
num_examples: 255
download_size: 668737
dataset_size: 1012624
---
# Dataset Card for "MULTI_VALUE_mrpc_zero_plural"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Arrivedercis/finreport-llama2-5k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2293425
num_examples: 10000
download_size: 1144776
dataset_size: 2293425
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "finreport-llama2-5k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
a-asad/combinedFarsiQuADs | ---
language:
- fa
license: apache-2.0
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 244809667
num_examples: 164506
- name: validation
num_bytes: 27887637
num_examples: 16997
download_size: 34751548
dataset_size: 272697304
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
Combines multiple QuAD datasets to create a new dataset similar to the [squad_v2](https://huggingface.co/datasets/squad_v2) dataset:
- [PersianQA](https://github.com/sajjjadayobi/PersianQA)
- [PQuAD](https://github.com/AUT-NLP/PQuAD)
- [ParSQuAD](https://github.com/BigData-IsfahanUni/ParSQuAD)
- [PersianQuAD](https://github.com/BigData-IsfahanUni/PersianQuAD) |
DialogueCharacter/chinese_dialogue_instruction_with_reward_score_judged_by_13B_baichuan2 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: reward_score
dtype: float64
splits:
- name: train
num_bytes: 144603592
num_examples: 110670
download_size: 83071987
dataset_size: 144603592
---
# Dataset Card for "chinese_dialogue_instruction_with_reward_score_judged_by_13B_baichuan2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marvinmedeiros52/sergioperdigao | ---
license: openrail
---
|
adalib/beatnum-sub-cond-gen | ---
dataset_info:
features:
- name: code
dtype: string
splits:
- name: train
num_bytes: 8570022
num_examples: 615
download_size: 3082500
dataset_size: 8570022
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Kochanoskill/xaya | ---
license: openrail
---
|
open-llm-leaderboard/details_DatPySci__pythia-1b-kto-iter0 | ---
pretty_name: Evaluation run of DatPySci/pythia-1b-kto-iter0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DatPySci/pythia-1b-kto-iter0](https://huggingface.co/DatPySci/pythia-1b-kto-iter0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DatPySci__pythia-1b-kto-iter0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T15:39:19.717013](https://huggingface.co/datasets/open-llm-leaderboard/details_DatPySci__pythia-1b-kto-iter0/blob/main/results_2024-02-29T15-39-19.717013.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24774728556136447,\n\
\ \"acc_stderr\": 0.03041175731725549,\n \"acc_norm\": 0.24901099168770932,\n\
\ \"acc_norm_stderr\": 0.031154173830246844,\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396731,\n \"mc2\": 0.3640465429999277,\n\
\ \"mc2_stderr\": 0.014283399348703093\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2773037542662116,\n \"acc_stderr\": 0.013082095839059374,\n\
\ \"acc_norm\": 0.30119453924914674,\n \"acc_norm_stderr\": 0.013406741767847627\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.38727345150368453,\n\
\ \"acc_stderr\": 0.004861314613286841,\n \"acc_norm\": 0.48954391555467036,\n\
\ \"acc_norm_stderr\": 0.00498869022950566\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.15789473684210525,\n \"acc_stderr\": 0.029674167520101456,\n\
\ \"acc_norm\": 0.15789473684210525,\n \"acc_norm_stderr\": 0.029674167520101456\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.17,\n\
\ \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.17,\n \
\ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.027134291628741713,\n\
\ \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.027134291628741713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.02834696377716246,\n\
\ \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.02834696377716246\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.1793103448275862,\n \"acc_stderr\": 0.03196766433373186,\n\
\ \"acc_norm\": 0.1793103448275862,\n \"acc_norm_stderr\": 0.03196766433373186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.23870967741935484,\n \"acc_stderr\": 0.024251071262208837,\n \"\
acc_norm\": 0.23870967741935484,\n \"acc_norm_stderr\": 0.024251071262208837\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2315270935960591,\n \"acc_stderr\": 0.02967833314144444,\n \"\
acc_norm\": 0.2315270935960591,\n \"acc_norm_stderr\": 0.02967833314144444\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.1919191919191919,\n \"acc_stderr\": 0.02805779167298901,\n \"\
acc_norm\": 0.1919191919191919,\n \"acc_norm_stderr\": 0.02805779167298901\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24870466321243523,\n \"acc_stderr\": 0.03119584087770031,\n\
\ \"acc_norm\": 0.24870466321243523,\n \"acc_norm_stderr\": 0.03119584087770031\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2153846153846154,\n \"acc_stderr\": 0.020843034557462878,\n\
\ \"acc_norm\": 0.2153846153846154,\n \"acc_norm_stderr\": 0.020843034557462878\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279483,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279483\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3174311926605505,\n \"acc_stderr\": 0.019957152198460497,\n \"\
acc_norm\": 0.3174311926605505,\n \"acc_norm_stderr\": 0.019957152198460497\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.031280390843298804,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.031280390843298804\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3273542600896861,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.3273542600896861,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.19631901840490798,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.19631901840490798,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n\
\ \"acc_stderr\": 0.029872577708891165,\n \"acc_norm\": 0.2948717948717949,\n\
\ \"acc_norm_stderr\": 0.029872577708891165\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.280970625798212,\n\
\ \"acc_stderr\": 0.016073127851221246,\n \"acc_norm\": 0.280970625798212,\n\
\ \"acc_norm_stderr\": 0.016073127851221246\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.023532925431044276,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.023532925431044276\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961441,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961441\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.024404394928087866,\n\
\ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.024404394928087866\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n\
\ \"acc_stderr\": 0.025122637608816632,\n \"acc_norm\": 0.26688102893890675,\n\
\ \"acc_norm_stderr\": 0.025122637608816632\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872405,\n \
\ \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3272058823529412,\n \"acc_stderr\": 0.028501452860396567,\n\
\ \"acc_norm\": 0.3272058823529412,\n \"acc_norm_stderr\": 0.028501452860396567\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.01774089950917779,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.01774089950917779\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17551020408163265,\n \"acc_stderr\": 0.024352800722970015,\n\
\ \"acc_norm\": 0.17551020408163265,\n \"acc_norm_stderr\": 0.024352800722970015\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.23493975903614459,\n\
\ \"acc_stderr\": 0.03300533186128922,\n \"acc_norm\": 0.23493975903614459,\n\
\ \"acc_norm_stderr\": 0.03300533186128922\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21637426900584794,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.21637426900584794,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396731,\n \"mc2\": 0.3640465429999277,\n\
\ \"mc2_stderr\": 0.014283399348703093\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5311760063141279,\n \"acc_stderr\": 0.014025142640639513\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \
\ \"acc_stderr\": 0.003447819272388999\n }\n}\n```"
repo_url: https://huggingface.co/DatPySci/pythia-1b-kto-iter0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|arc:challenge|25_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|gsm8k|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hellaswag|10_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T15-39-19.717013.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T15-39-19.717013.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- '**/details_harness|winogrande|5_2024-02-29T15-39-19.717013.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T15-39-19.717013.parquet'
- config_name: results
data_files:
- split: 2024_02_29T15_39_19.717013
path:
- results_2024-02-29T15-39-19.717013.parquet
- split: latest
path:
- results_2024-02-29T15-39-19.717013.parquet
---
# Dataset Card for Evaluation run of DatPySci/pythia-1b-kto-iter0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DatPySci/pythia-1b-kto-iter0](https://huggingface.co/DatPySci/pythia-1b-kto-iter0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DatPySci__pythia-1b-kto-iter0",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-29T15:39:19.717013](https://huggingface.co/datasets/open-llm-leaderboard/details_DatPySci__pythia-1b-kto-iter0/blob/main/results_2024-02-29T15-39-19.717013.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24774728556136447,
"acc_stderr": 0.03041175731725549,
"acc_norm": 0.24901099168770932,
"acc_norm_stderr": 0.031154173830246844,
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396731,
"mc2": 0.3640465429999277,
"mc2_stderr": 0.014283399348703093
},
"harness|arc:challenge|25": {
"acc": 0.2773037542662116,
"acc_stderr": 0.013082095839059374,
"acc_norm": 0.30119453924914674,
"acc_norm_stderr": 0.013406741767847627
},
"harness|hellaswag|10": {
"acc": 0.38727345150368453,
"acc_stderr": 0.004861314613286841,
"acc_norm": 0.48954391555467036,
"acc_norm_stderr": 0.00498869022950566
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.15789473684210525,
"acc_stderr": 0.029674167520101456,
"acc_norm": 0.15789473684210525,
"acc_norm_stderr": 0.029674167520101456
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.027134291628741713,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.027134291628741713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.02834696377716246,
"acc_norm": 0.251063829787234,
"acc_norm_stderr": 0.02834696377716246
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.038351539543994194,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.038351539543994194
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.1793103448275862,
"acc_stderr": 0.03196766433373186,
"acc_norm": 0.1793103448275862,
"acc_norm_stderr": 0.03196766433373186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23870967741935484,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.23870967741935484,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2315270935960591,
"acc_stderr": 0.02967833314144444,
"acc_norm": 0.2315270935960591,
"acc_norm_stderr": 0.02967833314144444
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1919191919191919,
"acc_stderr": 0.02805779167298901,
"acc_norm": 0.1919191919191919,
"acc_norm_stderr": 0.02805779167298901
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24870466321243523,
"acc_stderr": 0.03119584087770031,
"acc_norm": 0.24870466321243523,
"acc_norm_stderr": 0.03119584087770031
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2153846153846154,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.2153846153846154,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279483,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279483
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3174311926605505,
"acc_stderr": 0.019957152198460497,
"acc_norm": 0.3174311926605505,
"acc_norm_stderr": 0.019957152198460497
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.031280390843298804,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.031280390843298804
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3273542600896861,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.3273542600896861,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.19631901840490798,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.19631901840490798,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891165,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891165
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.280970625798212,
"acc_stderr": 0.016073127851221246,
"acc_norm": 0.280970625798212,
"acc_norm_stderr": 0.016073127851221246
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.023532925431044276,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.023532925431044276
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961441,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961441
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.024404394928087866,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.024404394928087866
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26688102893890675,
"acc_stderr": 0.025122637608816632,
"acc_norm": 0.26688102893890675,
"acc_norm_stderr": 0.025122637608816632
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967277,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872405,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3272058823529412,
"acc_stderr": 0.028501452860396567,
"acc_norm": 0.3272058823529412,
"acc_norm_stderr": 0.028501452860396567
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.01774089950917779,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.01774089950917779
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17551020408163265,
"acc_stderr": 0.024352800722970015,
"acc_norm": 0.17551020408163265,
"acc_norm_stderr": 0.024352800722970015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.23493975903614459,
"acc_stderr": 0.03300533186128922,
"acc_norm": 0.23493975903614459,
"acc_norm_stderr": 0.03300533186128922
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21637426900584794,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.21637426900584794,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396731,
"mc2": 0.3640465429999277,
"mc2_stderr": 0.014283399348703093
},
"harness|winogrande|5": {
"acc": 0.5311760063141279,
"acc_stderr": 0.014025142640639513
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.003447819272388999
}
}
```
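The top-level `"all"` block above aggregates the per-task scores; assuming the aggregate is an unweighted mean of the per-task `"acc"` values, it can be sanity-checked with a short sketch (the three tasks below are copied from the results above; a full check would iterate over every task in the JSON):

```python
# Sketch: compute the unweighted mean of per-task "acc" values.
# Only a hypothetical subset of the tasks is used here for brevity.
task_acc = {
    "harness|arc:challenge|25": 0.2773037542662116,
    "harness|hellaswag|10": 0.38727345150368453,
    "harness|hendrycksTest-abstract_algebra|5": 0.25,
}

mean_acc = sum(task_acc.values()) / len(task_acc)
```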
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/61d0c3df | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1325
dataset_size: 186
---
# Dataset Card for "61d0c3df"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PyaeSoneK/LegalFewShot | ---
license: openrail
---
|
ere3545/mikeyy2 | ---
license: bigcode-openrail-m
---
|
shmohseni/madani | ---
license: apache-2.0
---
|
awettig/Pile-Github-0.5B-6K-opt | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 6487050154
num_examples: 81380
- name: test
num_bytes: 64945692
num_examples: 813
download_size: 1121468368
dataset_size: 6551995846
---
# Dataset Card for "Pile-Github-0.5B-6K-opt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
phatjk/wikipedia_vi_qa | ---
dataset_info:
features:
- name: text
dtype: string
- name: question
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 8523200
num_examples: 20107
download_size: 4759406
dataset_size: 8523200
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wikipedia_vi_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Melody0025/datasettest001 | ---
license: afl-3.0
---
|
davidgaofc/techdebt_label | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: Diff
dtype: string
- name: Message
dtype: string
splits:
- name: train
num_bytes: 5042681
num_examples: 8793
- name: test
num_bytes: 1259287
num_examples: 2199
download_size: 1824024
dataset_size: 6301968
---
# Dataset Card for "techdebt_label"
This dataset was generated from [The Technical Debt Dataset](https://github.com/clowee/The-Technical-Debt-Dataset) created by Lenarduzzi et al.; the full citation appears below.
## Dataset Details and Structure
The labels were produced by the SonarQube software cited in the paper and matched to the diff of the commit where each message was raised. Each diff was then cleaned to include only the added lines of code.
## Bias, Risks, and Limitations
Be aware of the dataset's limited sample size and label variety. The queries used to extract the data are still being checked for correctness.
## Recommendations
This dataset is still being improved, so expect changes between versions.
## References
Valentina Lenarduzzi, Nyyti Saarimäki, Davide Taibi. The Technical Debt Dataset. Proceedings for the 15th Conference on Predictive Models and Data Analytics in Software Engineering. Brazil. 2019. |
A2H0H0R1/plant-disease-new | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Apple___Apple_scab
'1': Apple___Black_rot
'2': Apple___Cedar_apple_rust
'3': Apple___healthy
'4': Blueberry___healthy
'5': Cherry_(including_sour)___Powdery_mildew
'6': Cherry_(including_sour)___healthy
'7': Corn_(maize)___Cercospora_leaf_spot Gray_leaf_spot
'8': Corn_(maize)___Common_rust_
'9': Corn_(maize)___Northern_Leaf_Blight
'10': Corn_(maize)___healthy
'11': Grape___Black_rot
'12': Grape___Esca_(Black_Measles)
'13': Grape___Leaf_blight_(Isariopsis_Leaf_Spot)
'14': Grape___healthy
'15': Orange___Haunglongbing_(Citrus_greening)
'16': Peach___Bacterial_spot
'17': Peach___healthy
'18': Pepper,_bell___Bacterial_spot
'19': Pepper,_bell___healthy
'20': Potato___Early_blight
'21': Potato___Late_blight
'22': Potato___healthy
'23': Raspberry___healthy
'24': Soybean___healthy
'25': Squash___Powdery_mildew
'26': Strawberry___Leaf_scorch
'27': Strawberry___healthy
'28': Tomato___Bacterial_spot
'29': Tomato___Early_blight
'30': Tomato___Late_blight
'31': Tomato___Leaf_Mold
'32': Tomato___Septoria_leaf_spot
'33': Tomato___Spider_mites Two-spotted_spider_mite
'34': Tomato___Target_Spot
'35': Tomato___Tomato_Yellow_Leaf_Curl_Virus
'36': Tomato___Tomato_mosaic_virus
'37': Tomato___healthy
splits:
- name: train
num_bytes: 2234212851.536
num_examples: 162916
download_size: 2171353111
dataset_size: 2234212851.536
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Moneskn/Matrices-Image2Text | ---
dataset_info:
features:
- name: matrix_image
sequence:
sequence: float64
- name: matrix_array
dtype: string
splits:
- name: train
num_bytes: 826657666
num_examples: 3000
download_size: 211299036
dataset_size: 826657666
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- image-to-text
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
This dataset contains 3000 images and the string representations of 3000 random matrices. The matrix sizes are also random, ranging from 3x3 to 7x7.
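Each `matrix_image` entry is stored as a nested Python list of floats rather than an array. A minimal sketch of converting one entry for processing (the sample values below are made up for illustration):

```python
import numpy as np

# One "matrix_image" entry as stored in the dataset: a nested Python
# list of floats. The values below are made up for illustration.
matrix_image = [
    [1.0, 2.0, 3.0],
    [4.0, 5.0, 6.0],
    [7.0, 8.0, 9.0],
]

# Convert the nested list to a NumPy array before further processing.
matrix = np.asarray(matrix_image, dtype=np.float64)
print(matrix.shape)  # (3, 3)
```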
The matrix images are not NumPy arrays, so I recommend converting them to arrays while processing the data. |
Thanmay/indic-para-hi | ---
dataset_info:
features:
- name: id
dtype: string
- name: pivot
dtype: string
- name: input
dtype: string
- name: target
dtype: string
- name: references
list: string
- name: itv2 hi input
dtype: string
- name: itv2 hi target
dtype: string
- name: itv2 hi references
sequence: string
splits:
- name: test
num_bytes: 1597219
num_examples: 1000
- name: validation
num_bytes: 1634630
num_examples: 1000
download_size: 1350715
dataset_size: 3231849
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
SeyedAli/Persian-Text-NER | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
- name: label_entity_word
dtype: string
splits:
- name: train
num_bytes: 29340224
num_examples: 18434
- name: test
num_bytes: 7348909
num_examples: 4609
download_size: 8801763
dataset_size: 36689133
task_categories:
- token-classification
language:
- fa
--- |
cs492projectgroup/Bias | ---
license: apache-2.0
---
|
CyberHarem/elisabeth_bathory_brave_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of elisabeth_bathory_brave/エリザベート・バートリー〔ブレイブ〕/伊丽莎白·巴托里〔勇者〕 (Fate/Grand Order)
This is the dataset of elisabeth_bathory_brave/エリザベート・バートリー〔ブレイブ〕/伊丽莎白·巴托里〔勇者〕 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `pink_hair, long_hair, blue_eyes, horns, pointy_ears, tail, dragon_tail, dragon_horns, curled_horns, ribbon, dragon_girl, two_side_up, small_breasts, breasts, fang`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 769.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elisabeth_bathory_brave_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 671.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elisabeth_bathory_brave_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1281 | 1.31 GiB | [Download](https://huggingface.co/datasets/CyberHarem/elisabeth_bathory_brave_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/elisabeth_bathory_brave_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, detached_sleeves, dress, looking_at_viewer, smile, solo, asymmetrical_horns, one_eye_closed, open_mouth, simple_background, white_background, ;d, circle_skirt, full_body, knee_boots, uneven_horns, claws, high_heels, holding, microphone_stand, shadow |
| 1 | 10 |  |  |  |  |  | 1girl, detached_sleeves, looking_at_viewer, open_mouth, solo, :d, black_dress, simple_background, white_background, blush, hair_ribbon, microphone |
| 2 | 7 |  |  |  |  |  | 1girl, bare_shoulders, corset, detached_sleeves, looking_at_viewer, plaid_skirt, solo, simple_background, white_background, open_mouth, :d, hand_on_own_hip, blush, dress |
| 3 | 12 |  |  |  |  |  | 1girl, detached_sleeves, looking_at_viewer, plaid_skirt, smile, solo, bare_shoulders, corset, open_mouth, blush, holding_microphone, microphone_stand, one_eye_closed |
| 4 | 7 |  |  |  |  |  | 1girl, detached_sleeves, looking_at_viewer, smile, solo, black_dress, holding_weapon, polearm |
| 5 | 13 |  |  |  |  |  | 1girl, detached_sleeves, dress_flower, hat_flower, looking_at_viewer, pink_dress, pink_headwear, pink_rose, solo, striped_headwear, top_hat, vertical-striped_clothes, vertical-striped_dress, holding_microphone, frilled_dress, blush, microphone_stand, pig, sleeveless, squirrel, layered_dress, circle_skirt, open_mouth, hair_between_eyes, polka_dot_dress, simple_background, :d, white_background, closed_mouth, long_sleeves |
| 6 | 10 |  |  |  |  |  | 1girl, solo, witch_hat, detached_sleeves, looking_at_viewer, choker, vertical-striped_clothes, vertical-striped_dress, halloween_costume, jack-o'-lantern, open_mouth, pumpkin, :d, bat_wings, black_thighhighs, demon_tail, earrings, star_(symbol), blush, food, holding, polearm |
| 7 | 5 |  |  |  |  |  | 1girl, blush, collarbone, frilled_bikini, hair_between_eyes, looking_at_viewer, navel, solo, bare_shoulders, simple_background, smile, white_background, cowboy_shot, hair_ribbon, open_mouth, white_bikini, ;d, cleavage, closed_mouth, official_alternate_costume, one_eye_closed, see-through, white_shirt |
| 8 | 5 |  |  |  |  |  | 1girl, bikini_armor, black_thighhighs, gauntlets, looking_at_viewer, pauldrons, red_armor, red_bikini, simple_background, solo, vambraces, white_background, white_cape, blush, navel, open_mouth, silver_trim, smile, tiara, elbow_gloves, arm_up, armored_boots, choker, hair_ribbon, holding_sword, slime_(creature) |
| 9 | 5 |  |  |  |  |  | 1girl, armored_boots, bikini_armor, black_thighhighs, holding_sword, looking_at_viewer, navel, pauldrons, red_armor, silver_trim, tiara, blush, gauntlets, gloves, holding_shield, red_bikini, simple_background, smile, solo, vambraces, white_background, full_body, ass, choker, open_mouth, red_footwear, white_cape |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | detached_sleeves | dress | looking_at_viewer | smile | solo | asymmetrical_horns | one_eye_closed | open_mouth | simple_background | white_background | ;d | circle_skirt | full_body | knee_boots | uneven_horns | claws | high_heels | holding | microphone_stand | shadow | :d | black_dress | blush | hair_ribbon | microphone | bare_shoulders | corset | plaid_skirt | hand_on_own_hip | holding_microphone | holding_weapon | polearm | dress_flower | hat_flower | pink_dress | pink_headwear | pink_rose | striped_headwear | top_hat | vertical-striped_clothes | vertical-striped_dress | frilled_dress | pig | sleeveless | squirrel | layered_dress | hair_between_eyes | polka_dot_dress | closed_mouth | long_sleeves | witch_hat | choker | halloween_costume | jack-o'-lantern | pumpkin | bat_wings | black_thighhighs | demon_tail | earrings | star_(symbol) | food | collarbone | frilled_bikini | navel | cowboy_shot | white_bikini | cleavage | official_alternate_costume | see-through | white_shirt | bikini_armor | gauntlets | pauldrons | red_armor | red_bikini | vambraces | white_cape | silver_trim | tiara | elbow_gloves | arm_up | armored_boots | holding_sword | slime_(creature) | gloves | holding_shield | ass | red_footwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:--------|:--------------------|:--------|:-------|:---------------------|:-----------------|:-------------|:--------------------|:-------------------|:-----|:---------------|:------------|:-------------|:---------------|:--------|:-------------|:----------|:-------------------|:---------|:-----|:--------------|:--------|:--------------|:-------------|:-----------------|:---------|:--------------|:------------------|:---------------------|:-----------------|:----------|:---------------|:-------------|:-------------|:----------------|:------------|:-------------------|:----------|:---------------------------|:-------------------------|:----------------|:------|:-------------|:-----------|:----------------|:--------------------|:------------------|:---------------|:---------------|:------------|:---------|:--------------------|:------------------|:----------|:------------|:-------------------|:-------------|:-----------|:----------------|:-------|:-------------|:-----------------|:--------|:--------------|:---------------|:-----------|:-----------------------------|:--------------|:--------------|:---------------|:------------|:------------|:------------|:-------------|:------------|:-------------|:--------------|:--------|:---------------|:---------|:----------------|:----------------|:-------------------|:---------|:-----------------|:------|:---------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | | X | | X | | | X | X | X | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | | X | | | X | X | X | | | | | | | | | | | X | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | X | X | | X | X | X | | X | X | | | | | | | | | | | X | | | | X | | | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | | X | X | X | | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 13 |  |  |  |  |  | X | X | | X | | X | | | X | X | X | | X | | | | | | | X | | X | | X | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | X | | X | | X | | | X | | | | | | | | | | X | | | X | | X | | | | | | | | | X | | | | | | | | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | X | X | X | | X | X | X | X | X | | | | | | | | | | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 8 | 5 |  |  |  |  |  | X | | | X | X | X | | | X | X | X | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 9 | 5 |  |  |  |  |  | X | | | X | X | X | | | X | X | X | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | | | X | X | | X | X | X | X |
|
yuqingXing/leave_data | ---
task_categories:
- image-classification
language:
- en
tags:
- biology
size_categories:
- 10M<n<100M
---
One-hundred plant species leaves dataset. The dataset is derived from this paper: Charles Mallah, James Cope, James Orwell. Plant Leaf Classification Using Probabilistic Integration of Shape, Texture and Margin Features. Signal Processing, Pattern Recognition and Applications, in press, 2013.
(1) Sources:
(a) Original owners of the colour leaf samples:
James Cope, Thibaut Beghin, Paolo Remagnino, Sarah Barman.
The colour images are not included.
The leaves were collected in the Royal Botanic Gardens, Kew, UK.
email: james.cope@kingston.ac.uk
(b) This dataset consists of work carried out by James Cope, Charles Mallah, and James Orwell.
(2) Donor of the database:
Charles Mallah, charles.mallah@kingston.ac.uk; James Cope, james.cope@kingston.ac.uk.
(3) Dataset Information:
The original data directory contains the binary images (masks) of the leaf samples (colour images not included). There are three features for each image: Shape, Margin and Texture. For each feature, a 64-element vector is given per leaf sample. These vectors are taken as a contiguous descriptor (for shape) or histograms (for texture and margin). So, there are three different files, one for each feature problem.
Each row has a 64-element feature vector followed by the Class label.
There is a total of 1600 samples with 16 samples per leaf class (100 classes), and no missing values.
‘data_Sha_64.txt’ -> prediction based on shape
‘data_Tex_64.txt’ -> prediction based on texture
‘data_Mar_64.txt’ -> prediction based on margin
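Since each row holds a 64-element feature vector followed by the class label, a minimal sketch of parsing one row, assuming comma-separated values with the label in the last column (the example row below is made up):

```python
def parse_leaf_row(line):
    """Split one row into a 64-element feature vector and its class label.

    Assumes comma-separated values with the label in the final column;
    adjust the delimiter if the actual files use something else.
    """
    parts = line.strip().split(",")
    features = [float(v) for v in parts[:-1]]
    label = parts[-1]
    return features, label

# Made-up example row: 64 feature values followed by a class label.
row = ",".join(["0.5"] * 64) + ",some_species"
features, label = parse_leaf_row(row)
print(len(features), label)  # 64 some_species
```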
(4) References:
[1]Charles Mallah, James Cope, James Orwell. Plant Leaf Classification Using Probabilistic Integration of Shape, Texture and Margin Features. Signal Processing, Pattern Recognition and Applications, in press.
[2]J. Cope, P. Remagnino, S. Barman, and P. Wilkin. Plant texture classification using gabor co-occurrences. Advances in Visual Computing, pages 699-677, 2010.
[3]T. Beghin, J. Cope, P. Remagnino, and S. Barman. Shape and texture based plant leaf classification. In: Advanced Concepts for Intelligent Vision Systems, pages 345-353. Springer, 2010. |
carcanha/marilha | ---
license: openrail
---
|
Hualouz/GraphTranslator-arxiv | ---
license: bsd-3-clause-clear
---
|
Andyrasika/TweetSumm-tuned | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: conversation
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2268632
num_examples: 879
- name: validation
num_bytes: 267236
num_examples: 110
- name: test
num_bytes: 296944
num_examples: 110
download_size: 1595884
dataset_size: 2832812
---
# Dataset Card for "TweetSumm-tuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Bikesuffer/truck20_selected | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 7585913.0
num_examples: 20
download_size: 7583930
dataset_size: 7585913.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_MexIvanov__zephyr-python-ru | ---
pretty_name: Evaluation run of MexIvanov/zephyr-python-ru
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MexIvanov/zephyr-python-ru](https://huggingface.co/MexIvanov/zephyr-python-ru)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MexIvanov__zephyr-python-ru\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-23T16:26:04.991527](https://huggingface.co/datasets/open-llm-leaderboard/details_MexIvanov__zephyr-python-ru/blob/main/results_2023-12-23T16-26-04.991527.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5991461225741262,\n\
\ \"acc_stderr\": 0.03306015344516284,\n \"acc_norm\": 0.6048288788808908,\n\
\ \"acc_norm_stderr\": 0.033742531689769865,\n \"mc1\": 0.37821297429620565,\n\
\ \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.5280108433994529,\n\
\ \"mc2_stderr\": 0.015317682476455754\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5290102389078498,\n \"acc_stderr\": 0.014586776355294314,\n\
\ \"acc_norm\": 0.5614334470989761,\n \"acc_norm_stderr\": 0.014500682618212864\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6224855606452898,\n\
\ \"acc_stderr\": 0.004837744647345717,\n \"acc_norm\": 0.8202549292969528,\n\
\ \"acc_norm_stderr\": 0.0038319023702881065\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.03988903703336284,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.03988903703336284\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332786,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332786\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7258064516129032,\n\
\ \"acc_stderr\": 0.0253781399708852,\n \"acc_norm\": 0.7258064516129032,\n\
\ \"acc_norm_stderr\": 0.0253781399708852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365886,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365886\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.02469721693087894,\n \
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.02469721693087894\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.01714985851425095,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.01714985851425095\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n\
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145635,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145635\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598025,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598025\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7905491698595147,\n\
\ \"acc_stderr\": 0.014551310568143698,\n \"acc_norm\": 0.7905491698595147,\n\
\ \"acc_norm_stderr\": 0.014551310568143698\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242832,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242832\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.17206703910614526,\n\
\ \"acc_stderr\": 0.012623438533220628,\n \"acc_norm\": 0.17206703910614526,\n\
\ \"acc_norm_stderr\": 0.012623438533220628\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026992544339297236,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026992544339297236\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399662,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399662\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409828,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409828\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.02965823509766691,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.02965823509766691\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n\
\ \"acc_stderr\": 0.012633353557534425,\n \"acc_norm\": 0.42698826597131684,\n\
\ \"acc_norm_stderr\": 0.012633353557534425\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6176470588235294,\n \"acc_stderr\": 0.019659922493623343,\n \
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.019659922493623343\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n\
\ \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37821297429620565,\n\
\ \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.5280108433994529,\n\
\ \"mc2_stderr\": 0.015317682476455754\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827943\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3252463987869598,\n \
\ \"acc_stderr\": 0.01290390475254392\n }\n}\n```"
repo_url: https://huggingface.co/MexIvanov/zephyr-python-ru
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|arc:challenge|25_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|gsm8k|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hellaswag|10_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-26-04.991527.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T16-26-04.991527.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- '**/details_harness|winogrande|5_2023-12-23T16-26-04.991527.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-23T16-26-04.991527.parquet'
- config_name: results
data_files:
- split: 2023_12_23T16_26_04.991527
path:
- results_2023-12-23T16-26-04.991527.parquet
- split: latest
path:
- results_2023-12-23T16-26-04.991527.parquet
---
# Dataset Card for Evaluation run of MexIvanov/zephyr-python-ru
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MexIvanov/zephyr-python-ru](https://huggingface.co/MexIvanov/zephyr-python-ru) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MexIvanov__zephyr-python-ru",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-23T16:26:04.991527](https://huggingface.co/datasets/open-llm-leaderboard/details_MexIvanov__zephyr-python-ru/blob/main/results_2023-12-23T16-26-04.991527.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5991461225741262,
"acc_stderr": 0.03306015344516284,
"acc_norm": 0.6048288788808908,
"acc_norm_stderr": 0.033742531689769865,
"mc1": 0.37821297429620565,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.5280108433994529,
"mc2_stderr": 0.015317682476455754
},
"harness|arc:challenge|25": {
"acc": 0.5290102389078498,
"acc_stderr": 0.014586776355294314,
"acc_norm": 0.5614334470989761,
"acc_norm_stderr": 0.014500682618212864
},
"harness|hellaswag|10": {
"acc": 0.6224855606452898,
"acc_stderr": 0.004837744647345717,
"acc_norm": 0.8202549292969528,
"acc_norm_stderr": 0.0038319023702881065
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.03988903703336284,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.03988903703336284
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332786,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332786
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7258064516129032,
"acc_stderr": 0.0253781399708852,
"acc_norm": 0.7258064516129032,
"acc_norm_stderr": 0.0253781399708852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365886,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365886
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.02469721693087894,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.02469721693087894
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.01714985851425095,
"acc_norm": 0.8,
"acc_norm_stderr": 0.01714985851425095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145635
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598025,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598025
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7905491698595147,
"acc_stderr": 0.014551310568143698,
"acc_norm": 0.7905491698595147,
"acc_norm_stderr": 0.014551310568143698
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242832,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242832
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.17206703910614526,
"acc_stderr": 0.012623438533220628,
"acc_norm": 0.17206703910614526,
"acc_norm_stderr": 0.012623438533220628
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.026992544339297236,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.026992544339297236
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399662,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399662
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409828,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409828
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.02965823509766691,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.02965823509766691
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42698826597131684,
"acc_stderr": 0.012633353557534425,
"acc_norm": 0.42698826597131684,
"acc_norm_stderr": 0.012633353557534425
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.019659922493623343,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.019659922493623343
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37821297429620565,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.5280108433994529,
"mc2_stderr": 0.015317682476455754
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827943
},
"harness|gsm8k|5": {
"acc": 0.3252463987869598,
"acc_stderr": 0.01290390475254392
}
}
```
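Each per-task entry in the results JSON above pairs an `acc` with an `acc_norm` (when the task reports one). A minimal sketch of reducing such entries to a single average score — the task names and values are copied from the results above, and the `headline` helper is a hypothetical name, not part of any library:

```python
# Reduce per-task results (as in the JSON above) to one headline score per task.
results = {
    "harness|arc:challenge|25": {"acc": 0.5290102389078498, "acc_norm": 0.5614334470989761},
    "harness|hellaswag|10": {"acc": 0.6224855606452898, "acc_norm": 0.8202549292969528},
    "harness|winogrande|5": {"acc": 0.7679558011049724},
}

def headline(metrics):
    # Prefer the length-normalized accuracy when the task reports one.
    return metrics.get("acc_norm", metrics.get("acc"))

scores = {task: headline(m) for task, m in results.items()}
average = sum(scores.values()) / len(scores)
print(round(average, 4))  # → 0.7165
```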
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_yam-peleg__Experiment19-7B | ---
pretty_name: Evaluation run of yam-peleg/Experiment19-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yam-peleg/Experiment19-7B](https://huggingface.co/yam-peleg/Experiment19-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yam-peleg__Experiment19-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-19T18:09:52.338335](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment19-7B/blob/main/results_2024-02-19T18-09-52.338335.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6369774461420744,\n\
\ \"acc_stderr\": 0.03249014878349776,\n \"acc_norm\": 0.6366803430351798,\n\
\ \"acc_norm_stderr\": 0.03316829176635558,\n \"mc1\": 0.6046511627906976,\n\
\ \"mc1_stderr\": 0.017115815632418208,\n \"mc2\": 0.7817594211835219,\n\
\ \"mc2_stderr\": 0.013695289301759589\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068749,\n\
\ \"acc_norm\": 0.7235494880546075,\n \"acc_norm_stderr\": 0.013069662474252425\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7001593308105954,\n\
\ \"acc_stderr\": 0.004572515919210699,\n \"acc_norm\": 0.8860784704242183,\n\
\ \"acc_norm_stderr\": 0.003170666122517656\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469546,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469546\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n\
\ \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.7483870967741936,\n\
\ \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812142,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812142\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.02412112541694119,\n \
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659807,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659807\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099857,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099857\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n\
\ \"acc_stderr\": 0.02759917430064077,\n \"acc_norm\": 0.8088235294117647,\n\
\ \"acc_norm_stderr\": 0.02759917430064077\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n\
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464081,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464081\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677003,\n\
\ \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677003\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n\
\ \"acc_stderr\": 0.016115235504865464,\n \"acc_norm\": 0.3664804469273743,\n\
\ \"acc_norm_stderr\": 0.016115235504865464\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.026568921015457152,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.026568921015457152\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579921,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579921\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982062,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982062\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.02768691358801301,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.02768691358801301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6046511627906976,\n\
\ \"mc1_stderr\": 0.017115815632418208,\n \"mc2\": 0.7817594211835219,\n\
\ \"mc2_stderr\": 0.013695289301759589\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433537\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6542835481425322,\n \
\ \"acc_stderr\": 0.01310042299044157\n }\n}\n```"
repo_url: https://huggingface.co/yam-peleg/Experiment19-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|arc:challenge|25_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|gsm8k|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hellaswag|10_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T18-09-52.338335.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T18-09-52.338335.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- '**/details_harness|winogrande|5_2024-02-19T18-09-52.338335.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-19T18-09-52.338335.parquet'
- config_name: results
data_files:
- split: 2024_02_19T18_09_52.338335
path:
- results_2024-02-19T18-09-52.338335.parquet
- split: latest
path:
- results_2024-02-19T18-09-52.338335.parquet
---
# Dataset Card for Evaluation run of yam-peleg/Experiment19-7B
Dataset automatically created during the evaluation run of model [yam-peleg/Experiment19-7B](https://huggingface.co/yam-peleg/Experiment19-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yam-peleg__Experiment19-7B",
"harness_winogrande_5",
split="train")
```
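Because each configuration stores one split per run timestamp (plus a `latest` alias), you can also resolve the most recent run yourself. Timestamped split names like the one above sort lexicographically by date, so picking the maximum works. This is a minimal sketch with a hypothetical helper, `latest_split`, not part of the `datasets` API:

```python
def latest_split(split_names):
    """Return the most recent timestamped split name.

    Assumes split names follow the YYYY_MM_DDTHH_MM_SS... pattern used in
    this dataset's configs, which sorts lexicographically by recency.
    """
    runs = [s for s in split_names if s != "latest"]
    return max(runs)

print(latest_split(["2024_02_19T18_09_52.338335", "latest"]))
# → 2024_02_19T18_09_52.338335
```

The resolved name can then be passed as `split=` to `load_dataset` in place of `"train"` or `"latest"` to pin a specific run.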
## Latest results
These are the [latest results from run 2024-02-19T18:09:52.338335](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment19-7B/blob/main/results_2024-02-19T18-09-52.338335.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its "latest" split):
```json
{
"all": {
"acc": 0.6369774461420744,
"acc_stderr": 0.03249014878349776,
"acc_norm": 0.6366803430351798,
"acc_norm_stderr": 0.03316829176635558,
"mc1": 0.6046511627906976,
"mc1_stderr": 0.017115815632418208,
"mc2": 0.7817594211835219,
"mc2_stderr": 0.013695289301759589
},
"harness|arc:challenge|25": {
"acc": 0.7013651877133106,
"acc_stderr": 0.013374078615068749,
"acc_norm": 0.7235494880546075,
"acc_norm_stderr": 0.013069662474252425
},
"harness|hellaswag|10": {
"acc": 0.7001593308105954,
"acc_stderr": 0.004572515919210699,
"acc_norm": 0.8860784704242183,
"acc_norm_stderr": 0.003170666122517656
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469546,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469546
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812142,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812142
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659807,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659807
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099857,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.02759917430064077,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.02759917430064077
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290902,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290902
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464081,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464081
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677003,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677003
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.016115235504865464,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.016115235504865464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.026568921015457152,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.026568921015457152
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579921,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579921
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0190709855896875,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0190709855896875
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982062,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982062
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801301,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6046511627906976,
"mc1_stderr": 0.017115815632418208,
"mc2": 0.7817594211835219,
"mc2_stderr": 0.013695289301759589
},
"harness|winogrande|5": {
"acc": 0.8453038674033149,
"acc_stderr": 0.010163172650433537
},
"harness|gsm8k|5": {
"acc": 0.6542835481425322,
"acc_stderr": 0.01310042299044157
}
}
```
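Each per-task key above has the form `harness|<task>|<num_fewshot>`. As a minimal sketch (the helper name is illustrative, not part of the leaderboard tooling), the nested results can be flattened into `(task, shots, metric, value)` rows:

```python
def flatten_results(results: dict) -> list[tuple[str, int, str, float]]:
    """Flatten {'harness|task|shots': {metric: value}} into rows.

    The 'all' entry holds aggregate metrics and is skipped here.
    """
    rows = []
    for key, metrics in results.items():
        if key == "all":
            continue
        # Keys look like "harness|hellaswag|10" or "harness|winogrande|5".
        _, task, shots = key.split("|")
        for metric, value in metrics.items():
            rows.append((task, int(shots), metric, value))
    return rows


# A small excerpt of the JSON above:
sample = {
    "all": {"acc": 0.6369774461420744},
    "harness|winogrande|5": {"acc": 0.8453038674033149,
                             "acc_stderr": 0.010163172650433537},
}
print(flatten_results(sample))
```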
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_76 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1270133204.0
num_examples: 249437
download_size: 1297440145
dataset_size: 1270133204.0
---
# Dataset Card for "chunk_76"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Smuggling1710/vERPv2 | ---
license: apache-2.0
---
|
strombergnlp/bajer_danish_misogyny | ---
annotations_creators:
- expert-generated
language_creators:
- found
language: da
license: other
multilinguality:
- monolingual
pretty_name: 'BAJER: Annotations for Misogyny'
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- hate-speech-detection
tags:
- not-for-all-audiences
extra_gated_prompt: "To receive a copy of the BAJER Dataset, the Researcher(s) must observe the restrictions listed below. In addition to other possible remedies, failure to observe these restrictions may result in revocation of permission to use the data as well as denial of access to additional material. By accessing this dataset you agrees to the following restrictions on the BAJER Dataset: **Purpose.** The Dataset will be used for research and/or statistical purposes only. **Redistribution** The Dataset, in whole or in part, will not be further distributed, published, copied, or disseminated in any way or form whatsoever, whether for profit or not. The Researcher(s) is solely liable for all claims, losses, damages, costs, fees, and expenses resulting from their disclosure of the data. **Modification and Commercial Use** The Dataset, in whole or in part, will not be modified or used for commercial purposes. The right granted herein is specifically for the internal research purposes of Researcher(s), and Researcher(s) shall not duplicate or use the disclosed Database or its contents either directly or indirectly for commercialization or any other direct for-profit purpose. **Storage** The Researcher(s) must ensure that the data is stored and processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures in accordance with the GDPR. **Disclaimers** The Database has been developed as part of research conducted at ITU Copenhagen. The Database is experimental in nature and is made available “as is” without obligation by ITU Copenhagen to provide accompanying services or support. The entire risk as to the quality and
performance of the Database is with Researcher(s). **Governing law and indemnification** This agreement is governed by Danish law. To the extent allowed by law, the Researcher(s) shall indemnify and hold harmless ITU against any and all claims, losses, damages, costs, fees, and expenses resulting from Researcher(s) possession and/or use of the Dataset."
extra_gated_fields:
Your name and title: text
Organisation name: text
Organisation / Researcher Address: text
Contact e-mail address: text
extra_gated_heading: "Acknowledge ITU clearance agreement for the BAJER Dataset to access the repository"
extra_gated_button_content: "Accept license"
---
# Dataset Card for "Bajer"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://stromberg.ai/publication/aom/](https://stromberg.ai/publication/aom/)
- **Repository:** [https://github.com/StrombergNLP/Online-Misogyny-in-Danish-Bajer](https://github.com/StrombergNLP/Online-Misogyny-in-Danish-Bajer)
- **Paper:** [https://aclanthology.org/2021.acl-long.247/](https://aclanthology.org/2021.acl-long.247/)
- **Point of Contact:** [Leon Derczynski](https://github.com/leondz)
- **Size of downloaded dataset files:** 7.29 MiB
- **Size of the generated dataset:** 6.57 MiB
- **Total amount of disk used:** 13.85 MiB
### Dataset Summary
This is a high-quality dataset of annotated posts sampled from social
media posts and annotated for misogyny. Danish language.
Online misogyny, a category of online abusive language, has serious and
harmful social consequences. Automatic detection of misogynistic language
online, while imperative, poses complicated challenges to both data
gathering, data annotation, and bias mitigation, as this type of data is
linguistically complex and diverse.
See the accompanying ACL paper [Annotating Online Misogyny](https://aclanthology.org/2021.acl-long.247/) for full details.
### Supported Tasks and Leaderboards
* `hate-speech-detection`: the dataset can be used to train and evaluate models for detecting misogynistic language in Danish.
### Languages
Danish (`bcp47:da`)
## Dataset Structure
### Data Instances
#### Bajer
- **Size of downloaded dataset files:** 7.29 MiB
- **Size of the generated dataset:** 6.57 MiB
- **Total amount of disk used:** 13.85 MiB
An example of 'train' looks as follows.
```
{
'id': '0',
'dataset_id': '0',
'label_id': '0',
'text': 'Tilfældigt hva, din XXXXXXXXXX 🤬🤬🤬',
'sampling': 'keyword_twitter',
'subtask_A': 1,
'subtask_B': 0,
'subtask_C1': 3,
'subtask_C2': 6
}
```
### Data Fields
- `id`: a `string` feature, unique identifier in this dataset.
- `dataset_id`: a `string` feature, internal annotation identifier.
- `label_id`: a `string` feature, internal annotation sequence number.
- `text`: a `string` of the text that's annotated.
- `sampling`: a `string` describing which sampling technique surfaced this message
- `subtask_A`: is the text abusive `ABUS` or not `NOT`? `0: NOT, 1: ABUS`
- `subtask_B`: for abusive text, what's the target - individual `IND`, group `GRP`, other `OTH`, or untargeted `UNT`? `0: IND, 1: GRP, 2: OTH, 3: UNT, 4: not applicable`
- `subtask_C1`: for group-targeted abuse, what's the group - misogynistic `SEX`, other `OTH`, or racist `RAC`? `0: SEX, 1: OTH, 2: RAC, 3: not applicable`
- `subtask_C2`: for misogyny, is it neosexist `NEOSEX`, discrediting `DISCREDIT`, normative stereotyping `NOR`, benevolent sexism `AMBIVALENT`, dominance `DOMINANCE`, or harassment `HARASSMENT`? `0: NEOSEX, 1: DISCREDIT, 2: NOR, 3: AMBIVALENT, 4: DOMINANCE, 5: HARASSMENT, 6: not applicable`
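For convenience, the integer codes above can be mapped back to their string labels. The dictionaries below are transcribed from the field descriptions ("n/a" stands for "not applicable"); the `decode` helper is illustrative, not part of the dataset:

```python
LABEL_MAPS = {
    "subtask_A": {0: "NOT", 1: "ABUS"},
    "subtask_B": {0: "IND", 1: "GRP", 2: "OTH", 3: "UNT", 4: "n/a"},
    "subtask_C1": {0: "SEX", 1: "OTH", 2: "RAC", 3: "n/a"},
    "subtask_C2": {0: "NEOSEX", 1: "DISCREDIT", 2: "NOR",
                   3: "AMBIVALENT", 4: "DOMINANCE", 5: "HARASSMENT",
                   6: "n/a"},
}


def decode(example: dict) -> dict:
    """Replace integer subtask codes with their string labels."""
    out = dict(example)
    for field, mapping in LABEL_MAPS.items():
        if field in out:
            out[field] = mapping[out[field]]
    return out


# The example instance from "Data Instances" above:
ex = {"subtask_A": 1, "subtask_B": 0, "subtask_C1": 3, "subtask_C2": 6}
print(decode(ex))
# → {'subtask_A': 'ABUS', 'subtask_B': 'IND', 'subtask_C1': 'n/a', 'subtask_C2': 'n/a'}
```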
### Data Splits
| name |train|
|---------|----:|
|bajer|27880 sentences|
## Dataset Creation
### Curation Rationale
The goal was to collect data for developing an annotation schema of online misogyny.
Random sampling of text often results in scarcity of examples of specifically misogynistic content (e.g. (Wulczyn et al., 2017;
Founta et al., 2018)). Therefore, we used the common alternative of collecting data by using predefined keywords with a potentially high search hit
(e.g. Waseem and Hovy (2016)), and identifying
relevant user-profiles (e.g. (Anzovino et al., 2018))
and related topics (e.g. (Kumar et al., 2018)).
We searched for keywords (specific slurs, hashtags), that are known to occur in sexist posts. These
were defined by previous work, a slur list from
Reddit, and from interviews and surveys of online
misogyny among women. We also searched for
broader terms like “sex” or “women”, which do
not appear exclusively in a misogynistic context,
for example in the topic search, where we gathered
relevant posts and their comments from the social
media pages of public media. A complete list of
keywords can be found in the appendix.
Social media provides a potentially biased, but
broad snapshot of online human discourse, with
plenty of language and behaviours represented. Following best practice guidelines (Vidgen and Derczynski, 2020), we sampled from a language for
which there are no existing annotations of the target
phenomenon: Danish.
Different social media platforms attract different user groups and can exhibit domain-specific
language (Karan and Šnajder, 2018). Rather than
choosing one platform (existing misogyny datasets
are primarily based on Twitter and Reddit (Guest
et al., 2021)), we sampled from multiple platforms:
Statista (2020) shows that the platform where most
Danish users are present is Facebook, followed
by Twitter, YouTube, Instagram and lastly, Reddit.
The dataset was sampled from Twitter, Facebook
and Reddit posts as plain text.
### Source Data
#### Initial Data Collection and Normalization
The dataset was sampled from Twitter, Facebook
and Reddit posts as plain text. Data was gathered based on: keyword-based search (i.e. purposive sampling); topic-based search; and content from specific users.
#### Who are the source language producers?
Danish-speaking social media users
### Annotations
#### Annotation process
In annotating our dataset, we built on the MATTER
framework (Pustejovsky and Stubbs, 2012) and use
the variation presented by Finlayson and Erjavec
(2017) (the MALER framework), where the Train & Test stages are replaced by Leveraging of annotations for one’s particular goal, in our case the
creation of a comprehensive taxonomy.
We created a set of guidelines for the annotators.
The annotators were first asked to read the guidelines and individually annotate about 150 different
posts, after which there was a shared discussion.
After this pilot round, the volume of samples per annotator was increased and every sample labeled by
2-3 annotators. When instances were ‘flagged’ or
annotators disagreed on them, they were discussed
during weekly meetings, and misunderstandings
were resolved together with the external facilitator. After round three, when reaching 7k annotated
posts (Figure 2), we continued with independent
annotations maintaining a 15% instance overlap
between randomly picked annotator pairs.
Management of annotator disagreement is an important part of the process design. Disagreements
can be solved by majority voting (Davidson et al.,
2017; Wiegand et al., 2019), labeled as abuse if at
least one annotator has labeled it (Golbeck et al.,
2017) or by a third objective instance (Gao and
Huang, 2017). Most datasets use crowdsourcing
platforms or a few academic experts for annotation
(Vidgen and Derczynski, 2020). Inter-annotator agreement (IAA) and classification performance
are established as two grounded evaluation measurements for annotation quality (Vidgen and Derczynski, 2020). Comparing the performance of amateur annotators (while providing guidelines) with
expert annotators for sexism and racism annotation,
Waseem (2016) show that the quality of amateur
annotators is competitive with expert annotations
when several amateurs agree. Facing the trade-off
between training annotators intensely and the number of involved annotators, we continued with the
trained annotators and group discussions/ individual revisions for flagged content and disagreements
(Section 5.4).
#### Who are the annotators?
| Attribute | Value |
|---|---|
| Gender | 6 female, 2 male (8 total) |
| Age | 5 <30; 3 ≥30 |
| Ethnicity | 5 Danish; 1 Persian, 1 Arabic, 1 Polish |
| Study/occupation | Linguistics (2); Health/Software Design; Ethnography/Digital Design; Communication/Psychology; Anthropology/Broadcast Moderator; Ethnography/Climate Change; Film Artist |
### Personal and Sensitive Information
Usernames and PII were stripped during the annotation process: content containing them was skipped by annotators and elided from the final dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The data contains abusive language. It may be possible to identify original speakers based on the content, so the data is only available for research purposes under a restrictive license and conditions. We hope that identifying sexism can help moderators. There is a possibility that the content here could be used to generate misogyny in Danish, which would place women in Denmark in an even more hostile environment, and for this reason data access is restricted and tracked.
### Discussion of Biases
We have taken pains to mitigate as many biases as we were aware of in this work.
**Selection biases:** Selection biases for abusive
language can be seen in the sampling of text, for instance when using keyword search (Wiegand et al.,
2019), topic dependency (Ousidhoum et al., 2020), users (Wiegand et al., 2019), domain (Wiegand
et al., 2019), time (Florio et al., 2020) and lack of
linguistic variety (Vidgen and Derczynski, 2020).
**Label biases:** Label biases can be caused by, for
instance, non-representative annotator selection,
lack in training/domain expertise, preconceived
notions, or pre-held stereotypes. These biases are
treated in relation to abusive language datasets
by several sources, e.g. general sampling and
annotators biases (Waseem, 2016; Al Kuwatly
et al., 2020), biases towards minority identity
mentions based for example on gender or race
(Davidson et al., 2017; Dixon et al., 2018; Park
et al., 2018; Davidson et al., 2019), and political
annotator biases (Wich et al., 2020). Other qualitative biases comprise, for instance, demographic
bias, over-generalization, topic exposure as social
biases (Hovy and Spruit, 2016).
We applied several measures to mitigate biases
occurring through the annotation design and execution: First, we selected labels grounded in existing,
peer-reviewed research from more than one field.
Second, we aimed for diversity in annotator profiles
in terms of age, gender, dialect, and background.
Third, we recruited a facilitator with a background
in ethnographic studies and provided intense annotator training. Fourth, we engaged in weekly group
discussions, iteratively improving the codebook
and integrating edge cases. Fifth, the selection of
platforms from which we sampled data is based on
local user representation in Denmark, rather than
convenience. Sixth, diverse sampling methods for
data collection reduced selection biases.
### Other Known Limitations
The data is absolutely NOT a reasonable or in any way stratified sample of social media text, so class prevalence/balance here says nothing about incidences of these phenomena in the wild. That said, we hypothesise that the distribution of types of misogyny in this data (subtask C2) is roughly representative of how misogyny presents on the studied platforms.
## Additional Information
### Dataset Curators
The dataset is curated by the paper's authors and the ethnographer-led annotation team.
### Licensing Information
The data is licensed under a restrictive usage agreement. [Apply for access here](https://forms.gle/MPdV8FG8EUuS1MdS6)
### Citation Information
```
@inproceedings{zeinert-etal-2021-annotating,
title = "Annotating Online Misogyny",
author = "Zeinert, Philine and
Inie, Nanna and
Derczynski, Leon",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.247",
doi = "10.18653/v1/2021.acl-long.247",
pages = "3181--3197",
}
```
### Contributions
Author-added dataset [@leondz](https://github.com/leondz)
|
tyzhu/find_sent_before_sent_train_400_eval_40_random_permute_4 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 5045864.718168813
num_examples: 3754
- name: validation
num_bytes: 232610
num_examples: 200
download_size: 1204239
dataset_size: 5278474.718168813
---
# Dataset Card for "find_sent_before_sent_train_400_eval_40_random_permute_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudubstack-english | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 103588
num_examples: 1570
- name: corpus
num_bytes: 18199570
num_examples: 40221
download_size: 11382247
dataset_size: 18303158
---
# Dataset Card for "cqudubstack-english"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DeepLearner101/ImageNetSubsetValidate | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: train
num_bytes: 4050613.0
num_examples: 96
download_size: 4049988
dataset_size: 4050613.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ImageNetSubsetValidate"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jlbaker361/flickr_humans_dim_128_30k | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: src
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 879655939.0
num_examples: 30000
download_size: 878486771
dataset_size: 879655939.0
---
# Dataset Card for "flickr_humans_dim_128_30k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-muse256-muse512-wuerst-sdv15/44203dc9 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 192
num_examples: 10
download_size: 1374
dataset_size: 192
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "44203dc9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bsd_ja_en | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
- ja
license:
- cc-by-nc-sa-4.0
multilinguality:
- translation
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- translation
task_ids: []
paperswithcode_id: business-scene-dialogue
pretty_name: Business Scene Dialogue
tags:
- business-conversations-translation
dataset_info:
features:
- name: id
dtype: string
- name: tag
dtype: string
- name: title
dtype: string
- name: original_language
dtype: string
- name: 'no'
dtype: int32
- name: en_speaker
dtype: string
- name: ja_speaker
dtype: string
- name: en_sentence
dtype: string
- name: ja_sentence
dtype: string
splits:
- name: train
num_bytes: 4778291
num_examples: 20000
- name: test
num_bytes: 492986
num_examples: 2120
- name: validation
num_bytes: 477935
num_examples: 2051
download_size: 1843443
dataset_size: 5749212
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
# Dataset Card for Business Scene Dialogue
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://raw.githubusercontent.com/tsuruoka-lab/BSD/)
- **Repository:** [Github](https://raw.githubusercontent.com/tsuruoka-lab/BSD/)
- **Paper:** [Rikters et al., 2019](https://www.aclweb.org/anthology/D19-5204)
- **Leaderboard:**
- **Point of Contact:** Matīss Rikters
### Dataset Summary
This is the Business Scene Dialogue (BSD) dataset,
a Japanese-English parallel corpus containing written conversations
in various business scenarios.
The dataset was constructed in three steps:

1. selecting business scenes,
2. writing monolingual conversation scenarios according to the selected scenes, and
3. translating the scenarios into the other language.
Half of the monolingual scenarios were written in Japanese
and the other half were written in English.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English, Japanese.
## Dataset Structure
### Data Instances
Each instance contains a conversation identifier, a sentence number indicating its
position within the conversation, the speaker name in English and Japanese,
the sentence text in English and Japanese, the original language,
the scene of the scenario (`tag`), and the title of the scenario (`title`).
```python
{
"id": "190315_E004_13",
"no": 14,
    "en_speaker": "Mr. Sam Lee",
"ja_speaker": "サム リーさん",
"en_sentence": "Would you guys consider a different scheme?",
"ja_sentence": "別の事業案も考慮されますか?",
"original_language": "en",
"tag": "phone call",
"title": "Phone: Review spec and scheme"
}
```
### Data Fields
- `id`: dialogue identifier
- `no`: sentence pair number within a dialogue
- `en_speaker`: speaker name in English
- `ja_speaker`: speaker name in Japanese
- `en_sentence`: sentence in English
- `ja_sentence`: sentence in Japanese
- `original_language`: language in which the monolingual scenario was written
- `tag`: scene of the scenario (e.g. "phone call")
- `title`: scenario title
### Data Splits
- There are a total of 24171 sentences / 808 business scenarios.
- Train: 20000 sentences / 670 scenarios
- Dev: 2051 sentences / 69 scenarios
- Test: 2120 sentences / 69 scenarios
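Since the corpus is distributed as flat sentence pairs, full dialogues can be regrouped using the `id` and `no` fields. A minimal sketch in plain Python (the two sample records below are illustrative; only the field names are taken from this card):

```python
from collections import defaultdict

def group_dialogues(examples):
    """Group flat sentence-pair records into dialogues.

    The dialogue key is the record's "id" minus its final "_<n>" segment
    (e.g. "190315_E004_13" -> "190315_E004"); within each dialogue,
    sentence pairs are ordered by their "no" field.
    """
    dialogues = defaultdict(list)
    for ex in examples:
        dialogue_id = ex["id"].rsplit("_", 1)[0]
        dialogues[dialogue_id].append(ex)
    for sentences in dialogues.values():
        sentences.sort(key=lambda ex: ex["no"])
    return dict(dialogues)

# Illustrative records shaped like the example instance above.
examples = [
    {"id": "190315_E004_14", "no": 15,
     "en_sentence": "Yes, certainly.", "ja_sentence": "はい、もちろんです。"},
    {"id": "190315_E004_13", "no": 14,
     "en_sentence": "Would you guys consider a different scheme?",
     "ja_sentence": "別の事業案も考慮されますか?"},
]
dialogues = group_dialogues(examples)
# dialogues["190315_E004"] now holds both sentence pairs, ordered by "no"
```

The same grouping works on the records returned by the `datasets` library, since each split exposes the fields listed above.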
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
The dataset is provided for research purposes only. Please check the dataset license for additional information.
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
This dataset was released under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) license.
### Citation Information
```
@inproceedings{rikters-etal-2019-designing,
title = "Designing the Business Conversation Corpus",
author = "Rikters, Mat{\=\i}ss and
Ri, Ryokan and
Li, Tong and
Nakazawa, Toshiaki",
booktitle = "Proceedings of the 6th Workshop on Asian Translation",
month = nov,
year = "2019",
address = "Hong Kong, China",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/D19-5204",
doi = "10.18653/v1/D19-5204",
pages = "54--61"
}
```
### Contributions
Thanks to [@j-chim](https://github.com/j-chim) for adding this dataset. |
open-llm-leaderboard/details_nlpguy__Westgate | ---
pretty_name: Evaluation run of nlpguy/Westgate
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nlpguy/Westgate](https://huggingface.co/nlpguy/Westgate) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nlpguy__Westgate\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-04T19:13:46.734414](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__Westgate/blob/main/results_2024-02-04T19-13-46.734414.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.656811843796802,\n\
\ \"acc_stderr\": 0.031938703448252795,\n \"acc_norm\": 0.6560424316624701,\n\
\ \"acc_norm_stderr\": 0.032612557879319985,\n \"mc1\": 0.4785801713586291,\n\
\ \"mc1_stderr\": 0.017487432144711806,\n \"mc2\": 0.6258706346942783,\n\
\ \"mc2_stderr\": 0.015516885259749542\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n\
\ \"acc_norm\": 0.7141638225255973,\n \"acc_norm_stderr\": 0.013203196088537372\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7126070503883688,\n\
\ \"acc_stderr\": 0.004516215206715354,\n \"acc_norm\": 0.8813981278629756,\n\
\ \"acc_norm_stderr\": 0.0032265867834212897\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754406,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754406\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266344,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266344\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700476,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700476\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553353,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553353\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.0134682016140663,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.0134682016140663\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n\
\ \"acc_stderr\": 0.01658868086453063,\n \"acc_norm\": 0.43687150837988825,\n\
\ \"acc_norm_stderr\": 0.01658868086453063\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n\
\ \"acc_stderr\": 0.0127569333828237,\n \"acc_norm\": 0.4771838331160365,\n\
\ \"acc_norm_stderr\": 0.0127569333828237\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.01879808628488689,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.01879808628488689\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160882,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160882\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4785801713586291,\n\
\ \"mc1_stderr\": 0.017487432144711806,\n \"mc2\": 0.6258706346942783,\n\
\ \"mc2_stderr\": 0.015516885259749542\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8571428571428571,\n \"acc_stderr\": 0.009834691297450127\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7005307050796058,\n \
\ \"acc_stderr\": 0.012616300735519656\n }\n}\n```"
repo_url: https://huggingface.co/nlpguy/Westgate
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|arc:challenge|25_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|gsm8k|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hellaswag|10_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T19-13-46.734414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T19-13-46.734414.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- '**/details_harness|winogrande|5_2024-02-04T19-13-46.734414.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-04T19-13-46.734414.parquet'
- config_name: results
data_files:
- split: 2024_02_04T19_13_46.734414
path:
- results_2024-02-04T19-13-46.734414.parquet
- split: latest
path:
- results_2024-02-04T19-13-46.734414.parquet
---
# Dataset Card for Evaluation run of nlpguy/Westgate
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nlpguy/Westgate](https://huggingface.co/nlpguy/Westgate) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nlpguy__Westgate",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-04T19:13:46.734414](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__Westgate/blob/main/results_2024-02-04T19-13-46.734414.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of its own config):
```python
{
"all": {
"acc": 0.656811843796802,
"acc_stderr": 0.031938703448252795,
"acc_norm": 0.6560424316624701,
"acc_norm_stderr": 0.032612557879319985,
"mc1": 0.4785801713586291,
"mc1_stderr": 0.017487432144711806,
"mc2": 0.6258706346942783,
"mc2_stderr": 0.015516885259749542
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.7141638225255973,
"acc_norm_stderr": 0.013203196088537372
},
"harness|hellaswag|10": {
"acc": 0.7126070503883688,
"acc_stderr": 0.004516215206715354,
"acc_norm": 0.8813981278629756,
"acc_norm_stderr": 0.0032265867834212897
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754406,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266344,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266344
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700476,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700476
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553353,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553353
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.0134682016140663,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.0134682016140663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43687150837988825,
"acc_stderr": 0.01658868086453063,
"acc_norm": 0.43687150837988825,
"acc_norm_stderr": 0.01658868086453063
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700856,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700856
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.0127569333828237,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.0127569333828237
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.01879808628488689,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.01879808628488689
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160882,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160882
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4785801713586291,
"mc1_stderr": 0.017487432144711806,
"mc2": 0.6258706346942783,
"mc2_stderr": 0.015516885259749542
},
"harness|winogrande|5": {
"acc": 0.8571428571428571,
"acc_stderr": 0.009834691297450127
},
"harness|gsm8k|5": {
"acc": 0.7005307050796058,
"acc_stderr": 0.012616300735519656
}
}
```
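The per-task results above can be aggregated with a short, stdlib-only sketch. The `sample` dict below is a hypothetical excerpt in the same shape as the results JSON, not the full data:

```python
import json

# Hypothetical excerpt in the same shape as the results JSON above.
sample = """
{
  "harness|hendrycksTest-virology|5": {"acc": 0.5662650602409639},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.8421052631578947},
  "harness|winogrande|5": {"acc": 0.8571428571428571}
}
"""

def mean_mmlu_acc(results: dict) -> float:
    """Average 'acc' over all MMLU (hendrycksTest) subtasks, ignoring other tasks."""
    accs = [v["acc"] for k, v in results.items()
            if k.startswith("harness|hendrycksTest-")]
    return sum(accs) / len(accs)

results = json.loads(sample)
print(round(mean_mmlu_acc(results), 4))  # → 0.7042
```

Running the same function over the full results dict yields the MMLU average reported on the leaderboard.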
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
joonhok-exo-ai/korean_law_case_codes | ---
license: openrail
language:
- ko
tags:
- legal
size_categories:
- n<1K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** [Joonho Kim (김준호)](mailto:joonhok@smartfitnow.com)
### Dataset Summary
A dataset of all case-type codes prescribed in the [Established Rules on the Assignment of Code Letters by Case Type (Jaeil 2003-1, Court Established Rules No. 1769)](https://glaw.scourt.go.kr/wsjo/gchick/sjo330.do?contId=3245922&q=%EC%82%AC%EA%B1%B4%EB%B3%84+%EB%B6%80%ED%98%B8%EB%AC%B8%EC%9E%90&nq=&w=total&pg=NaN#1696829627652).
## Additional Information
### Dataset Curators
Joonho Kim (김준호) ([LinkedIn](https://www.linkedin.com/in/joonho-kim/)): I created this dataset myself because I needed it while building an AI-powered legal service.
### Contributions
If you find any errors in the data, please contact [joonhok@smartfitnow.com](mailto:joonhok@smartfitnow.com) and I will verify and apply the correction.
TheFinAI/flare-es-efp | ---
dataset_info:
features:
- name: 'query:'
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
- name: query
dtype: string
splits:
- name: test
num_bytes: 66200
num_examples: 37
download_size: 43563
dataset_size: 66200
---
# Dataset Card for "flare-es-efp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/code_instructions_standardized_cluster_15_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 6504226
num_examples: 7136
download_size: 2840518
dataset_size: 6504226
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_15_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ucekmez/OpenOrca-tr | ---
license: mit
language:
- tr
---
## A Turkish-language subset of the OpenOrca dataset
This subset of the OpenOrca dataset comprises 798,350 question-answer pairs in Turkish,
predominantly translated from English using Google Translate.
Wherever possible, specific terminology and unique names were retained unchanged during translation.
Feel free to submit pull requests to enhance the quality of the dataset.
Contact: https://www.linkedin.com/in/ugur-cekmez/ |
santiagxf/spanish-marketing-tweets | ---
license: unlicense
---
|
collabora/whisper-hindi-preprocessed | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: input_length
dtype: float64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 668819734185.926
num_examples: 694964
- name: test
num_bytes: 2093443544.0
num_examples: 2179
download_size: 337918400583
dataset_size: 670913177729.926
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
DTU54DL/librispeech-augmentated-validation-prepared | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: validation
num_bytes: 3218361347.125
num_examples: 2703
download_size: 1286686337
dataset_size: 3218361347.125
---
# Dataset Card for "librispeech-augmentated-validation-prepared"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Writer__InstructPalmyra-20b | ---
pretty_name: Evaluation run of Writer/InstructPalmyra-20b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Writer/InstructPalmyra-20b](https://huggingface.co/Writer/InstructPalmyra-20b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Writer__InstructPalmyra-20b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-12T18:44:54.114721](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__InstructPalmyra-20b/blob/main/results_2023-10-12T18-44-54.114721.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.01929530201342282,\n\
\ \"em_stderr\": 0.0014087520774405866,\n \"f1\": 0.08355075503355748,\n\
\ \"f1_stderr\": 0.0019786281704718585,\n \"acc\": 0.33648760481410367,\n\
\ \"acc_stderr\": 0.008897385527705382\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.01929530201342282,\n \"em_stderr\": 0.0014087520774405866,\n\
\ \"f1\": 0.08355075503355748,\n \"f1_stderr\": 0.0019786281704718585\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02577710386656558,\n \
\ \"acc_stderr\": 0.004365042953621808\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6471981057616417,\n \"acc_stderr\": 0.013429728101788956\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Writer/InstructPalmyra-20b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|arc:challenge|25_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_12T18_44_54.114721
path:
- '**/details_harness|drop|3_2023-10-12T18-44-54.114721.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-12T18-44-54.114721.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_12T18_44_54.114721
path:
- '**/details_harness|gsm8k|5_2023-10-12T18-44-54.114721.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-12T18-44-54.114721.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hellaswag|10_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_12T18_44_54.114721
path:
- '**/details_harness|winogrande|5_2023-10-12T18-44-54.114721.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-12T18-44-54.114721.parquet'
- config_name: results
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- results_2023-08-29T16:04:46.105936.parquet
- split: 2023_10_12T18_44_54.114721
path:
- results_2023-10-12T18-44-54.114721.parquet
- split: latest
path:
- results_2023-10-12T18-44-54.114721.parquet
---
# Dataset Card for Evaluation run of Writer/InstructPalmyra-20b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Writer/InstructPalmyra-20b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Writer/InstructPalmyra-20b](https://huggingface.co/Writer/InstructPalmyra-20b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Writer__InstructPalmyra-20b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-12T18:44:54.114721](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__InstructPalmyra-20b/blob/main/results_2023-10-12T18-44-54.114721.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.01929530201342282,
"em_stderr": 0.0014087520774405866,
"f1": 0.08355075503355748,
"f1_stderr": 0.0019786281704718585,
"acc": 0.33648760481410367,
"acc_stderr": 0.008897385527705382
},
"harness|drop|3": {
"em": 0.01929530201342282,
"em_stderr": 0.0014087520774405866,
"f1": 0.08355075503355748,
"f1_stderr": 0.0019786281704718585
},
"harness|gsm8k|5": {
"acc": 0.02577710386656558,
"acc_stderr": 0.004365042953621808
},
"harness|winogrande|5": {
"acc": 0.6471981057616417,
"acc_stderr": 0.013429728101788956
}
}
```
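For quick inspection, the aggregated numbers above can be sliced with plain Python; the sketch below copies a trimmed version of the snippet and pulls out the accuracy of each individual task (the dict literal is copied from the results above, not fetched from the Hub):

```python
# Trimmed copy of the latest-results snippet shown above.
results = {
    "all": {
        "em": 0.01929530201342282,
        "f1": 0.08355075503355748,
        "acc": 0.33648760481410367,
    },
    "harness|gsm8k|5": {"acc": 0.02577710386656558, "acc_stderr": 0.004365042953621808},
    "harness|winogrande|5": {"acc": 0.6471981057616417, "acc_stderr": 0.013429728101788956},
}

# Collect accuracy for each individual task, skipping the "all" aggregate.
task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

for task, acc in sorted(task_acc.items()):
    print(f"{task}: {acc:.4f}")
```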
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
SethGA/neocortex_grounded_23k | ---
language:
- en
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 40274528
num_examples: 23240
download_size: 17889564
dataset_size: 40274528
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-18000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1053492
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
damerajee/philo-embedding-datasets | ---
license: mit
---
|
1rsh/speech-qa-magahi-hi-karya | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 85441650.26107226
num_examples: 394
- name: test
num_bytes: 8903210.738927739
num_examples: 35
download_size: 90539947
dataset_size: 94344861.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Thefoodprocessor/ingredients_alternatives | ---
dataset_info:
features:
- name: id
dtype: int64
- name: recipe
dtype: string
- name: ingredients_alternatives
dtype: string
splits:
- name: train
num_bytes: 141623808
num_examples: 74465
download_size: 69175436
dataset_size: 141623808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BramDelisse/positive_scenarios | ---
license: apache-2.0
language:
- en
pretty_name: Book passages, and their positive counterpart
size_categories:
- 10K<n<100K
--- |
BlackKakapo/multitask-ro | ---
license: apache-2.0
multilinguality: monolingual
size_categories: 1M<n<5M
language: ro
task_categories:
- text2text-generation
- question-answering
- sentence-similarity
- text-classification
- translation
- summarization
---
## Dataset
### Train
| Dataset | Link | Rows | Task-specific prefix |
| ------ | ------ | ------ | ------ |
| **Paraphrase** | [Paraphrase](https://huggingface.co/datasets/BlackKakapo/paraphrase-ro) | 131951 | *paraphrase:* **string** |
| **Grammar** | [Grammar](https://huggingface.co/datasets/BlackKakapo/grammar-ro) | 1686054 | *grammar:* **string** |
| **Synonyms** | - | 14085 | *synonyms:* **word** |
| **Translate** | - | 999725 | *translate Romanian to English:* **string** |
| **Summarize** | [Summarize](https://huggingface.co/datasets/readerbench/ro-text-summarization) | 71999 | *summarize:* **string** |
| **Sentiment analysis** | [Sentiment analysis](https://huggingface.co/datasets/ro_sent) | 36498 | *sentiment analysis:* **string** |
| **STS** | [STS](https://huggingface.co/datasets/ro_sts) | 7499 | *sts:* **string** |
| **Offense analysis** | [Offense analysis](https://huggingface.co/datasets/readerbench/ro-fb-offense) | 3199 | *offense analysis:* **string** |
| **Gsm8k-ro** | [Gsm8k-ro](https://huggingface.co/datasets/BlackKakapo/gsm8k-ro) | 7474 | **string** |
| **Qasc-ro** | [Qasc-ro](https://huggingface.co/datasets/BlackKakapo/qasc-ro) | 8134 | **string** |
| **Recipes-ro** | [Recipes-ro](https://huggingface.co/datasets/BlackKakapo/recipes-ro) | 818 | 1. *Spune-mi reteta pentru* **string** 2. *Cum as putea face* **string** 3. *Spune-mi te rog cum as putea face* **string** |
| **Qaworld-ro** | [Qaworld-ro](https://huggingface.co/datasets/BlackKakapo/qaworld-ro) | 722659 | **string** |
| **News-ro** | - | 102369 | 1. *Genereaza o știre cu titlul dat si incepe-o astfel* **string** 2. *Scrie o știre cu denumirea asta si cu acest inceput* **string**|
| **Newsagro-ro** | - | 568 | 1. *Genereaza o știre cu titlul dat si incepe-o astfel* **string** 2. *Scrie o știre cu denumirea asta si cu acest inceput* **string**|
| **Instruction-dataset-ro** | [Instruction-dataset-ro](https://huggingface.co/datasets/BlackKakapo/instruction-dataset-ro) | 326 | **string**|
| **TOTAL** | [Multitask-ro](https://huggingface.co/datasets/BlackKakapo/multitask-ro) | **~3.792.698** | |
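The task-specific prefixes in the table are prepended to the raw input before it is fed to a text2text model; a minimal sketch (the prefix strings come from the table above, while the helper function and task keys are our own illustration):

```python
# Map each task to its prefix from the table above (illustrative keys).
PREFIXES = {
    "paraphrase": "paraphrase: ",
    "grammar": "grammar: ",
    "synonyms": "synonyms: ",
    "translate": "translate Romanian to English: ",
    "summarize": "summarize: ",
    "sentiment": "sentiment analysis: ",
    "sts": "sts: ",
    "offense": "offense analysis: ",
}

def build_input(task: str, text: str) -> str:
    """Prepend the task-specific prefix; tasks without a prefix pass through unchanged."""
    return PREFIXES.get(task, "") + text

print(build_input("translate", "Bună ziua!"))
# translate Romanian to English: Bună ziua!
```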
### Eval
| Dataset | Link | Rows | Task-specific prefix |
| ------ | ------ | ------ | ------ |
| **Paraphrase** | [Paraphrase](https://huggingface.co/datasets/BlackKakapo/paraphrase-ro) | 3540 | *paraphrase:* **string** |
| **Grammar** | [Grammar](https://huggingface.co/datasets/BlackKakapo/grammar-ro) | 200 | *grammar:* **string** |
| **Synonyms** | - | 318 | *synonyms:* **word** |
| **Translate** | [Translate](https://huggingface.co/datasets/opus100/viewer/en-ro/train) | 3271 | *translate Romanian to English:* **string** |
| **Summarize** | [Summarize](https://huggingface.co/datasets/readerbench/ro-text-summarization) | 449 | *summarize:* **string** |
| **Sentiment analysis** | [Sentiment analysis](https://huggingface.co/datasets/ro_sent) | 789 | *sentiment analysis:* **string** |
| **STS** | [STS](https://huggingface.co/datasets/ro_sts) | 1119 | *sts:* **string** |
| **Offense analysis** | [Offense analysis](https://huggingface.co/datasets/readerbench/ro-fb-offense) | 1251 | *offense analysis:* **string** |
| **Gsm8k-ro** | [Gsm8k-ro](https://huggingface.co/datasets/BlackKakapo/gsm8k-ro) | 1319 | **string** |
| **Qasc-ro** | [Qasc-ro](https://huggingface.co/datasets/BlackKakapo/qasc-ro) | 926 | **string** |
| **Recipes-ro** | [Recipes-ro](https://huggingface.co/datasets/BlackKakapo/recipes-ro) | 63 | 1. *Spune-mi reteta pentru* **string** 2. *Cum as putea face* **string** 3. *Spune-mi te rog cum as putea face* **string** |
| **Qaworld-ro** | [Qaworld-ro](https://huggingface.co/datasets/BlackKakapo/qaworld-ro) | 3350 | **string** |
| **News-ro** | - | 140 | 1. *Genereaza o știre cu titlul dat si incepe-o astfel* **string** 2. *Scrie o știre cu denumirea asta si cu acest inceput* **string**|
| **Newsagro-ro** | - | 112 | 1. *Genereaza o știre cu titlul dat si incepe-o astfel* **string** 2. *Scrie o știre cu denumirea asta si cu acest inceput* **string**|
| **TOTAL** | [Multitask-ro](https://huggingface.co/datasets/BlackKakapo/multitask-ro) | **16847** | |
[Original summarize]: <https://huggingface.co/datasets/readerbench/ro-text-summarization>
[Original sent]: <https://huggingface.co/datasets/ro_sent>
[Original sts]: <https://huggingface.co/datasets/ro_sts>
[Original offense]: <https://huggingface.co/datasets/readerbench/ro-fb-offense> |
ibranze/araproje_mmlu_tr_s5 | ---
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: validation
num_bytes: 137404.0
num_examples: 250
download_size: 83804
dataset_size: 137404.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_mmlu_tr_s5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cjerzak/LinkOrgs | ---
license: mit
---
Data repository for:
Brian Libgober, Connor T. Jerzak. Linking Datasets on Organizations Using Half-a-Billion Open-Collaborated Records. ArXiv Preprint, 2023. https://arxiv.org/abs/2302.02533
```
@article{LJ-LinkOrgs,
title={Linking Datasets on Organizations Using Half-a-Billion Open-Collaborated Records},
author={Libgober, Brian and Connor T. Jerzak},
journal={ArXiv Preprint},
year={2023}
}
```
This repository contains large-scale training data for improving linkage of data on organizations. `NegMatches_mat.csv` and `NegMatches_mat_hold.csv` contain millions of negative name-match examples derived from the LinkedIn network (see paper for details). `PosMatches_mat.csv` and `PosMatches_mat_hold.csv` contain millions of positive name-match examples derived from the LinkedIn network (see paper for details).
Additionally, files saved with the `*_bipartite` suffix refer to the bipartite network representation of the LinkedIn network that we use for improving linkage; corresponding files refer to the Markov network representation of the LinkedIn network that we use for improving linkage.
With any questions, don't hesitate to reach out to `connor.jerzak@gmail.com`.
|
Codec-SUPERB/Nsynth-test_unit | ---
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 26425359
num_examples: 4096
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 26425359
num_examples: 4096
- name: academicodec_hifi_24k_320d
num_bytes: 39532559
num_examples: 4096
- name: audiodec_24k_320d
num_bytes: 84162575
num_examples: 4096
- name: dac_16k
num_bytes: 78985231
num_examples: 4096
- name: dac_24k
num_bytes: 315242511
num_examples: 4096
- name: dac_44k
num_bytes: 102037519
num_examples: 4096
- name: encodec_24k_12bps
num_bytes: 157693967
num_examples: 4096
- name: encodec_24k_1_5bps
num_bytes: 19838991
num_examples: 4096
- name: encodec_24k_24bps
num_bytes: 315242511
num_examples: 4096
- name: encodec_24k_3bps
num_bytes: 39532559
num_examples: 4096
- name: encodec_24k_6bps
num_bytes: 78919695
num_examples: 4096
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 211433487
num_examples: 4096
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 211433487
num_examples: 4096
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 210384911
num_examples: 4096
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 105527311
num_examples: 4096
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 210384911
num_examples: 4096
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 105527311
num_examples: 4096
- name: speech_tokenizer_16k
num_bytes: 52705295
num_examples: 4096
download_size: 309928519
dataset_size: 2391435549
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
---
|
tiennv/mmarco-passage-en | ---
dataset_info:
features:
- name: query_id
dtype: int64
- name: query
dtype: string
- name: positive_id
dtype: int64
- name: positive
dtype: string
- name: negatives
sequence: string
splits:
- name: train
num_bytes: 9290250492
num_examples: 415936
download_size: 4889145151
dataset_size: 9290250492
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mmarco-passage-en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
samaikya/faces | ---
license: other
---
|
polinaeterna/test_splits_order | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: test
num_bytes: 32
num_examples: 2
- name: train
num_bytes: 48
num_examples: 2
download_size: 1776
dataset_size: 80
---
# Dataset Card for "test_splits_order"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/068927a6 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1324
dataset_size: 182
---
# Dataset Card for "068927a6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Teywa/Test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 338808
num_examples: 200
download_size: 201257
dataset_size: 338808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fddemarco/pushshift-reddit-comments | ---
dataset_info:
features:
- name: author
dtype: string
- name: body
dtype: string
- name: controversiality
dtype: int64
- name: created_utc
dtype: int64
- name: link_id
dtype: string
- name: score
dtype: int64
- name: subreddit
dtype: string
- name: subreddit_id
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 1920019700
num_examples: 15034827
download_size: 1920019700
dataset_size: 1920019700
---
# Dataset Card for "pushshift-reddit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
BeastyZ/LLM-Verified-Retrieval | ---
license: apache-2.0
language:
- en
configs:
- config_name: origin
data_files:
- split: test
path: "origin/*.json"
- config_name: summary-answer
data_files:
- split: test
path: "summary-answer/*.json"
---
# Dataset Card for LLM-Verified-Retrieval
There are three datasets: ASQA, QAMPARI, and ELI5. For each, we provide the raw data as well as data augmented with model-generated summaries and answers.
## Dataset Information
- raw data: We put the raw data in the origin directory. You can also find them and get more information in the repo of [ALCE](https://github.com/princeton-nlp/ALCE).
- summary-answer data: We put the data with summary and answer generated by the model (gpt-3.5-turbo-0301) in the summary-answer directory. You can also generate your own summary or answer if you have access to gpt-3.5-turbo-0301.
For brevity, we only introduce the fields we add in the summary-answer data. Please refer to [ALCE](https://github.com/princeton-nlp/ALCE) for additional information.
**ASQA**
- summary_use_sub: For each sample, we use its sub-questions to summarize each document it encompasses.
**QAMPARI**
- summary: For each sample, we use its question to summarize each document it encompasses.
**ELI5**
- answer: For each sample, we use its question and each document it encompasses to answer the question. |
automated-research-group/gpt2-winogrande_base | ---
dataset_info:
features:
- name: id
dtype: string
- name: response
dtype: string
- name: request
dtype: string
- name: input_perplexity
dtype: float64
- name: input_likelihood
dtype: float64
- name: output_perplexity
dtype: float64
- name: output_likelihood
dtype: float64
splits:
- name: validation
num_bytes: 357278
num_examples: 1267
download_size: 162691
dataset_size: 357278
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "gpt2-winogrande_base"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
showchen/Kurisu | ---
license: cc-by-4.0
task_categories:
- text-generation
- text2text-generation
language:
- en
- zh
size_categories:
- 10K<n<100K
---
# ChatHaruhi
# Reviving Anime Character in Reality via Large Language Model
github repo: https://github.com/LC1332/Chat-Haruhi-Suzumiya
**Chat-Haruhi-Suzumiya** is a language model that imitates the tone, personality, and storylines of characters like Haruhi Suzumiya.
<details>
<summary> The project was developed by Cheng Li, Ziang Leng, Chenxi Yan, Xiaoyang Feng, HaoSheng Wang, Junyi Shen, Hao Wang, Weishi Mi, Aria Fei, Song Yan, Linkang Zhan, Yaokai Jia, Pingyu Wu, and Haozhen Sun, etc. </summary>
This is an open source project and the members were recruited from open source communities like DataWhale.
Lulu Li ( [Cheng Li@SenseTime](https://github.com/LC1332) ) initiated the whole project and designed and implemented most of the features.
Ziang Leng ( [Ziang Leng@SenseTime](https://blairleng.github.io) ) designed and implemented the training, data generation, and backend architecture for ChatHaruhi 1.0.
Chenxi Yan ( [Chenxi Yan@Chengdu University of Information Technology](https://github.com/todochenxi) ) implemented and maintained the backend for ChatHaruhi 1.0.
Junyi Shen ( [Junyi Shen@Zhejiang University](https://github.com/J1shen) ) implemented the training code and participated in generating the training dataset.
Hao Wang ( [Hao Wang](https://github.com/wanghao07456) ) collected script data for a TV series and participated in data augmentation.
Weishi Mi ( [Weishi MI@Tsinghua University](https://github.com/hhhwmws0117) ) participated in data augmentation.
Aria Fei ( [Aria Fei@BJUT](https://ariafyy.github.io/) ) implemented the ASR feature for the script tool and participated in the Openness-Aware Personality paper project.
Xiaoyang Feng ( [Xiaoyang Feng@Nanjing Agricultural University](https://github.com/fengyunzaidushi) ) integrated the script recognition tool and participated in the Openness-Aware Personality paper project.
Yue Leng ( [Song Yan](https://github.com/zealot52099) ) collected data from The Big Bang Theory and implemented script format conversion.
scixing (HaoSheng Wang) ( [HaoSheng Wang](https://github.com/ssccinng) ) implemented voiceprint recognition in the script tool and tts-vits speech synthesis.
Linkang Zhan ( [JunityZhan@Case Western Reserve University](https://github.com/JunityZhan) ) collected Genshin Impact's system prompts and story data.
Yaokai Jia ( [Yaokai Jia](https://github.com/KaiJiaBrother) ) implemented the Vue frontend and practiced GPU extraction of Bert in a psychology project.
Pingyu Wu ( [Pingyu Wu@Juncai Shuyun](https://github.com/wpydcr) ) helped deploy the first version of the training code.
Haozhen Sun ( [Haozhen Sun@Tianjin University] ) plotted the character figures for ChatHaruhi.
</details>
### Citation
Please cite the repo if you use the data or code in this repo.
```
@misc{li2023chatharuhi,
title={ChatHaruhi: Reviving Anime Character in Reality via Large Language Model},
author={Cheng Li and Ziang Leng and Chenxi Yan and Junyi Shen and Hao Wang and Weishi MI and Yaying Fei and Xiaoyang Feng and Song Yan and HaoSheng Wang and Linkang Zhan and Yaokai Jia and Pingyu Wu and Haozhen Sun},
year={2023},
eprint={2308.09597},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
IsaacJu666/pokemon | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: text_blip
dtype: string
splits:
- name: train
num_bytes: 56583875.0
num_examples: 833
download_size: 50947153
dataset_size: 56583875.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "pokemon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
afg1/sentence-pair-contradictions | ---
license: unknown
task_categories:
- text-classification
language:
- en
size_categories:
- 10K<n<100K
---
A dataset of contradictory sentence pairs.
This is the dataset from a Master's thesis: https://repositorio-aberto.up.pt/bitstream/10216/129029/2/415679.pdf. The dataset's name comes from the title of the thesis. Some documentation for the dataset is available in the following repository: https://github.com/BeatrizBaldaia/sentence-pair-contradictions/tree/master
Originally collated by Beatriz Souto de Sá Baldaia; I take no credit for that work. I simply processed and uploaded it here because I needed to train a sentence-contradiction model.
This dataset is a merge of several others, so I'm not sure what the license should be.
I think there is quite a lot of political content in the data, but models trained on it do seem to generalise. |
joey234/mmlu-human_sexuality-neg-prepend-verbal | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 6430
num_examples: 5
- name: test
num_bytes: 910938
num_examples: 131
download_size: 146756
dataset_size: 917368
---
# Dataset Card for "mmlu-human_sexuality-neg-prepend-verbal"
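The `answer` feature above is stored as a class-label index whose declared names map 0–3 to the letters A–D. A minimal decoding sketch (the `LABELS` list and `decode_answer` helper are hypothetical names, written only to illustrate the declared label order):

```python
# Letter choices in the order declared by the class_label names above:
# '0': A, '1': B, '2': C, '3': D.
LABELS = ["A", "B", "C", "D"]

def decode_answer(idx: int) -> str:
    """Map a stored class-label index (0-3) back to its letter choice."""
    return LABELS[idx]

print(decode_answer(2))  # -> C
```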
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_BFauber__opt125m_10e2 | ---
pretty_name: Evaluation run of BFauber/opt125m_10e2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/opt125m_10e2](https://huggingface.co/BFauber/opt125m_10e2) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T18:49:35.603979](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e2/blob/main/results_2024-02-02T18-49-35.603979.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2648565772911682,\n\
\ \"acc_stderr\": 0.031051795928190056,\n \"acc_norm\": 0.26575034223671873,\n\
\ \"acc_norm_stderr\": 0.03185766293174892,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931583,\n \"mc2\": 0.4259479561362201,\n\
\ \"mc2_stderr\": 0.015004242057206638\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2090443686006826,\n \"acc_stderr\": 0.011882746987406453,\n\
\ \"acc_norm\": 0.23208191126279865,\n \"acc_norm_stderr\": 0.012336718284948854\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.29087831109340767,\n\
\ \"acc_stderr\": 0.004532393111248685,\n \"acc_norm\": 0.3140808603863772,\n\
\ \"acc_norm_stderr\": 0.004632001732332982\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2528301886792453,\n \"acc_stderr\": 0.026749899771241238,\n\
\ \"acc_norm\": 0.2528301886792453,\n \"acc_norm_stderr\": 0.026749899771241238\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.035146974678623884,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.035146974678623884\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2774566473988439,\n\
\ \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.2774566473988439,\n\
\ \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n\
\ \"acc_stderr\": 0.03455071019102146,\n \"acc_norm\": 0.18253968253968253,\n\
\ \"acc_norm_stderr\": 0.03455071019102146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n\
\ \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.20606060606060606,\n \"acc_stderr\": 0.031584153240477086,\n\
\ \"acc_norm\": 0.20606060606060606,\n \"acc_norm_stderr\": 0.031584153240477086\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.02407869658063547,\n \
\ \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.02407869658063547\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.03095663632856655,\n \
\ \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.03095663632856655\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22935779816513763,\n \"acc_stderr\": 0.018025349724618684,\n \"\
acc_norm\": 0.22935779816513763,\n \"acc_norm_stderr\": 0.018025349724618684\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.02830465794303531,\n\
\ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.02830465794303531\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.19730941704035873,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.19730941704035873,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596918,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596918\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.20535714285714285,\n\
\ \"acc_stderr\": 0.038342410214190735,\n \"acc_norm\": 0.20535714285714285,\n\
\ \"acc_norm_stderr\": 0.038342410214190735\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690879,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690879\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.210727969348659,\n\
\ \"acc_stderr\": 0.014583812465862546,\n \"acc_norm\": 0.210727969348659,\n\
\ \"acc_norm_stderr\": 0.014583812465862546\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.02289408248992599,\n\
\ \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.02289408248992599\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\
\ \"acc_stderr\": 0.014378169884098407,\n \"acc_norm\": 0.2446927374301676,\n\
\ \"acc_norm_stderr\": 0.014378169884098407\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.29012345679012347,\n \"acc_stderr\": 0.02525117393649502,\n\
\ \"acc_norm\": 0.29012345679012347,\n \"acc_norm_stderr\": 0.02525117393649502\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340460994,\n \
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340460994\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2529335071707953,\n\
\ \"acc_stderr\": 0.011102268713839989,\n \"acc_norm\": 0.2529335071707953,\n\
\ \"acc_norm_stderr\": 0.011102268713839989\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2173202614379085,\n \"acc_stderr\": 0.01668482092914859,\n \
\ \"acc_norm\": 0.2173202614379085,\n \"acc_norm_stderr\": 0.01668482092914859\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.27755102040816326,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.27755102040816326,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348398,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348398\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n\
\ \"acc_stderr\": 0.03175554786629921,\n \"acc_norm\": 0.21084337349397592,\n\
\ \"acc_norm_stderr\": 0.03175554786629921\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727654,\n\
\ \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727654\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931583,\n \"mc2\": 0.4259479561362201,\n\
\ \"mc2_stderr\": 0.015004242057206638\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5217048145224941,\n \"acc_stderr\": 0.01403923921648463\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \
\ \"acc_stderr\": 0.0010717793485492673\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/opt125m_10e2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|arc:challenge|25_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|gsm8k|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hellaswag|10_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-49-35.603979.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T18-49-35.603979.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- '**/details_harness|winogrande|5_2024-02-02T18-49-35.603979.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T18-49-35.603979.parquet'
- config_name: results
data_files:
- split: 2024_02_02T18_49_35.603979
path:
- results_2024-02-02T18-49-35.603979.parquet
- split: latest
path:
- results_2024-02-02T18-49-35.603979.parquet
---
# Dataset Card for Evaluation run of BFauber/opt125m_10e2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e2](https://huggingface.co/BFauber/opt125m_10e2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e2",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-02T18:49:35.603979](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e2/blob/main/results_2024-02-02T18-49-35.603979.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each can be found in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2648565772911682,
"acc_stderr": 0.031051795928190056,
"acc_norm": 0.26575034223671873,
"acc_norm_stderr": 0.03185766293174892,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931583,
"mc2": 0.4259479561362201,
"mc2_stderr": 0.015004242057206638
},
"harness|arc:challenge|25": {
"acc": 0.2090443686006826,
"acc_stderr": 0.011882746987406453,
"acc_norm": 0.23208191126279865,
"acc_norm_stderr": 0.012336718284948854
},
"harness|hellaswag|10": {
"acc": 0.29087831109340767,
"acc_stderr": 0.004532393111248685,
"acc_norm": 0.3140808603863772,
"acc_norm_stderr": 0.004632001732332982
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.24342105263157895,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.24342105263157895,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2528301886792453,
"acc_stderr": 0.026749899771241238,
"acc_norm": 0.2528301886792453,
"acc_norm_stderr": 0.026749899771241238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.035146974678623884,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.035146974678623884
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.03414014007044036,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.03414014007044036
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.03455071019102146,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.03455071019102146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233484,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233484
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.20606060606060606,
"acc_stderr": 0.031584153240477086,
"acc_norm": 0.20606060606060606,
"acc_norm_stderr": 0.031584153240477086
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3435897435897436,
"acc_stderr": 0.02407869658063547,
"acc_norm": 0.3435897435897436,
"acc_norm_stderr": 0.02407869658063547
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22935779816513763,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.22935779816513763,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.02830465794303531,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.02830465794303531
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.19730941704035873,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.19730941704035873,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596918,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596918
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.20535714285714285,
"acc_stderr": 0.038342410214190735,
"acc_norm": 0.20535714285714285,
"acc_norm_stderr": 0.038342410214190735
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690879,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690879
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.210727969348659,
"acc_stderr": 0.014583812465862546,
"acc_norm": 0.210727969348659,
"acc_norm_stderr": 0.014583812465862546
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098407,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098407
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.29012345679012347,
"acc_stderr": 0.02525117393649502,
"acc_norm": 0.29012345679012347,
"acc_norm_stderr": 0.02525117393649502
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340460994,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340460994
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2529335071707953,
"acc_stderr": 0.011102268713839989,
"acc_norm": 0.2529335071707953,
"acc_norm_stderr": 0.011102268713839989
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2173202614379085,
"acc_stderr": 0.01668482092914859,
"acc_norm": 0.2173202614379085,
"acc_norm_stderr": 0.01668482092914859
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.27755102040816326,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.27755102040816326,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348398,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348398
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.03175554786629921,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.03175554786629921
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.029170885500727654,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.029170885500727654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931583,
"mc2": 0.4259479561362201,
"mc2_stderr": 0.015004242057206638
},
"harness|winogrande|5": {
"acc": 0.5217048145224941,
"acc_stderr": 0.01403923921648463
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.0010717793485492673
}
}
```
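As an illustration (not part of the evaluation pipeline), the per-task entries in a results dict of this shape can be aggregated with plain Python — here a hypothetical three-task subset of the MMLU (`hendrycksTest`) entries above:

```python
# Compute the mean 5-shot accuracy over MMLU (hendrycksTest) subtasks
# from a results dict shaped like the JSON above (three entries shown).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.27},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.22962962962962963},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.24342105263157895},
}

# Select only the MMLU subtask entries by their key prefix.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(mmlu) / len(mmlu)
print(round(mean_acc, 4))
```

The same pattern extends to `acc_norm` or to other task families by changing the key prefix.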
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
beratcmn/instruction-turkish-poems | ---
license: apache-2.0
language:
- tr
size_categories:
- 1K<n<10K
---
# Turkish poems for fine-tuning LLMs with instructions.
Instructions were created with **Google's Gemini-Pro**.
For a dataset with a variety of instructions, see: beratcmn/rephrased-instruction-turkish-poems
Base dataset:
beratcmn/turkish-poems-cleaned
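As a minimal sketch of how an instruction/poem pair from this dataset could be joined into a single fine-tuning prompt (the column names `instruction` and `output` and the sample record are assumptions, not confirmed by this card; adjust them to the dataset's actual schema):

```python
# Sketch: format one instruction/poem record into a supervised-training text.
# The field names "instruction" and "output" are assumptions, not taken from
# this card; rename them to match the dataset's real columns.

def build_prompt(record: dict) -> str:
    """Join an instruction and its poem into one training prompt."""
    return (
        "### Talimat:\n"            # "Instruction:" in Turkish
        f"{record['instruction']}\n\n"
        "### Cevap:\n"              # "Answer:" in Turkish
        f"{record['output']}"
    )

# Hypothetical record, purely for illustration.
example = {
    "instruction": "Deniz hakkinda kisa bir siir yaz.",
    "output": "Dalgalar koyu bir maviye karisir...",
}
print(build_prompt(example))
```

A template like this can be mapped over the dataset before tokenization; the exact prompt markers are a stylistic choice.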
|
CyberHarem/himuro_kane_fatestaynightufotable | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Himuro Kane (Fate Stay Night [UFOTABLE])
This is the dataset of Himuro Kane (Fate Stay Night [UFOTABLE]), containing 11 images and their tags.
The core tags of this character are `glasses, long_hair, brown_eyes, brown_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 5.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/himuro_kane_fatestaynightufotable/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 5.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/himuro_kane_fatestaynightufotable/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 22 | 9.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/himuro_kane_fatestaynightufotable/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 5.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/himuro_kane_fatestaynightufotable/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 22 | 10.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/himuro_kane_fatestaynightufotable/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/himuro_kane_fatestaynightufotable',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; specific outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, solo, homurahara_academy_school_uniform |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | homurahara_academy_school_uniform |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:------------------------------------|
| 0 | 11 |  |  |  |  |  | X | X | X |
|
pytorch-survival/rotterdam_gbsg | ---
dataset_info:
features:
- name: x0
dtype: float32
- name: x1
dtype: float32
- name: x2
dtype: float32
- name: x3
dtype: float32
- name: x4
dtype: float32
- name: x5
dtype: float32
- name: x6
dtype: float32
- name: event_time
dtype: float32
- name: event_indicator
dtype: int32
splits:
- name: train
num_bytes: 80352
num_examples: 2232
download_size: 34711
dataset_size: 80352
---
# Dataset Card for "rotterdam_gbsg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nvsai/ElectronicDevices | ---
license: mit
---
|
open-llm-leaderboard/details_Yash21__TinyYi-7B-Test | ---
pretty_name: Evaluation run of Yash21/TinyYi-7b-Test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yash21/TinyYi-7b-Test](https://huggingface.co/Yash21/TinyYi-7b-Test) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yash21__TinyYi-7b-Test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-06T09:37:46.162753](https://huggingface.co/datasets/open-llm-leaderboard/details_Yash21__TinyYi-7b-Test/blob/main/results_2024-01-06T09-37-46.162753.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2443925509452191,\n\
\ \"acc_stderr\": 0.030445918316181916,\n \"acc_norm\": 0.24480573905020522,\n\
\ \"acc_norm_stderr\": 0.03125145025399728,\n \"mc1\": 0.21297429620563035,\n\
\ \"mc1_stderr\": 0.014332203787059678,\n \"mc2\": 0.4634983243757816,\n\
\ \"mc2_stderr\": 0.01640558930232759\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23037542662116042,\n \"acc_stderr\": 0.01230492841874761,\n\
\ \"acc_norm\": 0.2687713310580205,\n \"acc_norm_stderr\": 0.012955065963710686\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2551284604660426,\n\
\ \"acc_stderr\": 0.004350424750646203,\n \"acc_norm\": 0.2614021111332404,\n\
\ \"acc_norm_stderr\": 0.004385004998923463\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n\
\ \"acc_stderr\": 0.03406542058502653,\n \"acc_norm\": 0.1925925925925926,\n\
\ \"acc_norm_stderr\": 0.03406542058502653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.031546980450822305,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.031546980450822305\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.02634148037111836,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.02634148037111836\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.20833333333333334,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.0321473730202947,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.0321473730202947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793254,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793254\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.225531914893617,\n \"acc_stderr\": 0.02732107841738753,\n\
\ \"acc_norm\": 0.225531914893617,\n \"acc_norm_stderr\": 0.02732107841738753\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.038522733649243183,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.038522733649243183\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.21935483870967742,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.21935483870967742,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21674876847290642,\n \"acc_stderr\": 0.028990331252516235,\n\
\ \"acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.028990331252516235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2474747474747475,\n \"acc_stderr\": 0.030746300742124498,\n \"\
acc_norm\": 0.2474747474747475,\n \"acc_norm_stderr\": 0.030746300742124498\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178253,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178253\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2358974358974359,\n \"acc_stderr\": 0.021525965407408726,\n\
\ \"acc_norm\": 0.2358974358974359,\n \"acc_norm_stderr\": 0.021525965407408726\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184407,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184407\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.027722065493361286,\n\
\ \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.027722065493361286\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22568807339449543,\n \"acc_stderr\": 0.01792308766780305,\n \"\
acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.01792308766780305\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2037037037037037,\n \"acc_stderr\": 0.02746740180405799,\n \"\
acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.02746740180405799\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3542600896860987,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.3542600896860987,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728742,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728742\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2066115702479339,\n \"acc_stderr\": 0.03695980128098825,\n \"\
acc_norm\": 0.2066115702479339,\n \"acc_norm_stderr\": 0.03695980128098825\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052191,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052191\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03894641120044792,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03894641120044792\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.03989139859531773,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.03989139859531773\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.27350427350427353,\n\
\ \"acc_stderr\": 0.02920254015343117,\n \"acc_norm\": 0.27350427350427353,\n\
\ \"acc_norm_stderr\": 0.02920254015343117\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29118773946360155,\n\
\ \"acc_stderr\": 0.016246087069701393,\n \"acc_norm\": 0.29118773946360155,\n\
\ \"acc_norm_stderr\": 0.016246087069701393\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.30057803468208094,\n \"acc_stderr\": 0.0246853168672578,\n\
\ \"acc_norm\": 0.30057803468208094,\n \"acc_norm_stderr\": 0.0246853168672578\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.02440439492808787,\n\
\ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.02440439492808787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2282958199356913,\n\
\ \"acc_stderr\": 0.023839303311398215,\n \"acc_norm\": 0.2282958199356913,\n\
\ \"acc_norm_stderr\": 0.023839303311398215\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23989569752281617,\n\
\ \"acc_stderr\": 0.010906282617981634,\n \"acc_norm\": 0.23989569752281617,\n\
\ \"acc_norm_stderr\": 0.010906282617981634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27941176470588236,\n \"acc_stderr\": 0.01815287105153881,\n \
\ \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.01815287105153881\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.031157150869355547,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.031157150869355547\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\
\ \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n\
\ \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21297429620563035,\n\
\ \"mc1_stderr\": 0.014332203787059678,\n \"mc2\": 0.4634983243757816,\n\
\ \"mc2_stderr\": 0.01640558930232759\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5090765588003157,\n \"acc_stderr\": 0.014050170094497697\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Yash21/TinyYi-7b-Test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|arc:challenge|25_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|arc:challenge|25_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|gsm8k|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|gsm8k|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hellaswag|10_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hellaswag|10_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T07-43-40.305150.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T09-37-46.162753.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T09-37-46.162753.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- '**/details_harness|winogrande|5_2024-01-06T07-43-40.305150.parquet'
- split: 2024_01_06T09_37_46.162753
path:
- '**/details_harness|winogrande|5_2024-01-06T09-37-46.162753.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-06T09-37-46.162753.parquet'
- config_name: results
data_files:
- split: 2024_01_06T07_43_40.305150
path:
- results_2024-01-06T07-43-40.305150.parquet
- split: 2024_01_06T09_37_46.162753
path:
- results_2024-01-06T09-37-46.162753.parquet
- split: latest
path:
- results_2024-01-06T09-37-46.162753.parquet
---
# Dataset Card for Evaluation run of Yash21/TinyYi-7b-Test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Yash21/TinyYi-7b-Test](https://huggingface.co/Yash21/TinyYi-7b-Test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yash21__TinyYi-7b-Test",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-06T09:37:46.162753](https://huggingface.co/datasets/open-llm-leaderboard/details_Yash21__TinyYi-7b-Test/blob/main/results_2024-01-06T09-37-46.162753.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of its eval):
```python
{
"all": {
"acc": 0.2443925509452191,
"acc_stderr": 0.030445918316181916,
"acc_norm": 0.24480573905020522,
"acc_norm_stderr": 0.03125145025399728,
"mc1": 0.21297429620563035,
"mc1_stderr": 0.014332203787059678,
"mc2": 0.4634983243757816,
"mc2_stderr": 0.01640558930232759
},
"harness|arc:challenge|25": {
"acc": 0.23037542662116042,
"acc_stderr": 0.01230492841874761,
"acc_norm": 0.2687713310580205,
"acc_norm_stderr": 0.012955065963710686
},
"harness|hellaswag|10": {
"acc": 0.2551284604660426,
"acc_stderr": 0.004350424750646203,
"acc_norm": 0.2614021111332404,
"acc_norm_stderr": 0.004385004998923463
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1925925925925926,
"acc_stderr": 0.03406542058502653,
"acc_norm": 0.1925925925925926,
"acc_norm_stderr": 0.03406542058502653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.031546980450822305,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.031546980450822305
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.02634148037111836,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.02634148037111836
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.0321473730202947,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.0321473730202947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793254,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793254
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.225531914893617,
"acc_stderr": 0.02732107841738753,
"acc_norm": 0.225531914893617,
"acc_norm_stderr": 0.02732107841738753
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243183,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243183
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.21935483870967742,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.21935483870967742,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.028990331252516235,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.028990331252516235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2474747474747475,
"acc_stderr": 0.030746300742124498,
"acc_norm": 0.2474747474747475,
"acc_norm_stderr": 0.030746300742124498
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178253,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178253
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2358974358974359,
"acc_stderr": 0.021525965407408726,
"acc_norm": 0.2358974358974359,
"acc_norm_stderr": 0.021525965407408726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184407,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184407
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.027722065493361286,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.027722065493361286
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22568807339449543,
"acc_stderr": 0.01792308766780305,
"acc_norm": 0.22568807339449543,
"acc_norm_stderr": 0.01792308766780305
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.02746740180405799,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.02746740180405799
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3542600896860987,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.3542600896860987,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2066115702479339,
"acc_stderr": 0.03695980128098825,
"acc_norm": 0.2066115702479339,
"acc_norm_stderr": 0.03695980128098825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052191,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052191
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03894641120044792,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03894641120044792
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.03989139859531773,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.03989139859531773
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.27350427350427353,
"acc_stderr": 0.02920254015343117,
"acc_norm": 0.27350427350427353,
"acc_norm_stderr": 0.02920254015343117
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.29118773946360155,
"acc_stderr": 0.016246087069701393,
"acc_norm": 0.29118773946360155,
"acc_norm_stderr": 0.016246087069701393
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2282958199356913,
"acc_stderr": 0.023839303311398215,
"acc_norm": 0.2282958199356913,
"acc_norm_stderr": 0.023839303311398215
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590638,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23989569752281617,
"acc_stderr": 0.010906282617981634,
"acc_norm": 0.23989569752281617,
"acc_norm_stderr": 0.010906282617981634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.01815287105153881,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.01815287105153881
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.031157150869355547,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.031157150869355547
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.035509201856896294,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.035509201856896294
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21297429620563035,
"mc1_stderr": 0.014332203787059678,
"mc2": 0.4634983243757816,
"mc2_stderr": 0.01640558930232759
},
"harness|winogrande|5": {
"acc": 0.5090765588003157,
"acc_stderr": 0.014050170094497697
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
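The per-task scores above can be post-processed directly once the JSON is loaded. A minimal sketch, assuming the same key layout as the results file (a few illustrative entries are inlined here instead of calling `json.load()` on the real file), that averages the `hendrycksTest` (MMLU) accuracies:

```python
# Sketch: averaging per-task accuracies from a results dict with the same
# key layout as the JSON above. With the real file, `results` would come
# from json.load(); here a few illustrative entries are inlined.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.27},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.1925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.18421052631578946},
    "harness|winogrande|5": {"acc": 0.5090765588003157},
}

# Keep only the MMLU (hendrycksTest) subtasks and average their accuracy.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mmlu_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc = {mmlu_acc:.4f}")
```

The same filtering pattern works for `acc_norm`, or for selecting any other task family by its `harness|…` prefix.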
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lucadiliello/STORIES | ---
license: cc
language:
- en
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 34099206982
num_examples: 945354
- name: dev
num_bytes: 41804891
num_examples: 946
- name: test
num_bytes: 42356443
num_examples: 947
download_size: 15347401118
dataset_size: 34183368316
task_categories:
- fill-mask
- text-generation
pretty_name: STORIES
size_categories:
- 100K<n<1M
---
Original STORIES dataset from the paper [A Simple Method for Commonsense Reasoning](https://arxiv.org/pdf/1806.02847v2.pdf). |
DJ491/Danti | ---
license: cc-by-nc-4.0
---
|
shivam9980/bhojpuri-news | ---
license: apache-2.0
---
|
twang2218/chinese-law-and-regulations | ---
license: apache-2.0
dataset_info:
- config_name: default
features:
- name: publish_date
dtype: timestamp[ns]
- name: effective_date
dtype: timestamp[ns]
- name: type
dtype: string
- name: status
dtype: string
- name: title
dtype: string
- name: office
dtype: string
- name: office_level
dtype: string
- name: office_category
dtype: string
- name: effective_period
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 363619544
num_examples: 22552
download_size: 159516785
dataset_size: 363619544
- config_name: metadata
features:
- name: publish_date
dtype: timestamp[ns]
- name: effective_date
dtype: timestamp[ns]
- name: type
dtype: string
- name: status
dtype: string
- name: title
dtype: string
- name: office
dtype: string
- name: office_level
dtype: string
- name: office_category
dtype: string
- name: effective_period
dtype: string
splits:
- name: train
num_bytes: 4529871
num_examples: 22552
download_size: 740438
dataset_size: 4529871
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: metadata
data_files:
- split: train
path: metadata/train-*
---
|
autoevaluate/autoeval-staging-eval-project-emotion-f650c475-9895316 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: bhadresh-savani/distilbert-base-uncased-emotion
metrics: []
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: bhadresh-savani/distilbert-base-uncased-emotion
* Dataset: emotion
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@bhadresh-savani](https://huggingface.co/bhadresh-savani) for evaluating this model. |
jose-datamaran/targets_ghg | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': not_target
'1': target
splits:
- name: train
num_bytes: 139050.63525091799
num_examples: 653
- name: test
num_bytes: 34922.36474908201
num_examples: 164
download_size: 97996
dataset_size: 173973.0
---
# Dataset Card for "targets_ghg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/98e1bdf3 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1337
dataset_size: 188
---
# Dataset Card for "98e1bdf3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Santp98/ranking_options_processes | ---
dataset_info:
features:
- name: index
dtype: int64
- name: process_id
dtype: string
- name: description
dtype: string
splits:
- name: train
num_bytes: 5619635
num_examples: 23323
download_size: 3091438
dataset_size: 5619635
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ranking_options_processes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TaJ001/SampleLlama2_7b_data | ---
language:
- en
- fr
- de
task_categories:
- text2text-generation
- question-answering
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
anan-2024/twitter_dataset_1713214367 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 60386
num_examples: 176
download_size: 38485
dataset_size: 60386
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CVasNLPExperiments/TinyImagenet_800_validation_google_flan_t5_xxl_mode_A_ns_800 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 316228
num_examples: 800
download_size: 97772
dataset_size: 316228
---
# Dataset Card for "TinyImagenet_800_validation_google_flan_t5_xxl_mode_A_ns_800"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pseudolab/huggingface-krew-hackathon2023 | ---
license: cc-by-4.0
---
|