datasetId | card |
|---|---|
nikitam/ACES | ---
language:
- multilingual
license:
- cc-by-nc-sa-4.0
multilinguality:
- multilingual
source_datasets:
- FLORES-101, FLORES-200, PAWS-X, XNLI, XTREME, WinoMT, Wino-X, MuCOW, EuroParl ConDisco, ParcorFull
task_categories:
- translation
pretty_name: ACES
configs:
- config_name: ACES
data_files: challenge_set.jsonl
- config_name: Span-ACES
data_files: span_aces.jsonl
---
# Dataset Card for ACES and Span-ACES
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Discussion of Biases](#discussion-of-biases)
- [Usage](#usage)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contact](#contact)
## Dataset Description
- **Repository:** [ACES dataset repository](https://github.com/EdinburghNLP/ACES)
- **Paper:** [arXiv](https://arxiv.org/abs/2401.16313)
### Dataset Summary
ACES consists of 36,476 examples covering 146 language pairs and representing challenges from 68 phenomena for evaluating machine translation metrics. We focus on translation accuracy errors and base the phenomena covered in our challenge set on the Multidimensional Quality Metrics (MQM) ontology. The phenomena range from simple perturbations at the word/character level to more complex errors based on discourse and real-world knowledge.
29.01.2024: We also release Span-ACES, an extension of the ACES dataset in which the errors in the _incorrect-translation_ are explicitly marked using a <v>span</v> format.
### Supported Tasks and Leaderboards
- Evaluation of machine translation metrics
- Potentially useful for contrastive machine translation evaluation
### Languages
The dataset covers 146 language pairs as follows:
af-en, af-fa, ar-en, ar-fr, ar-hi, be-en, bg-en, bg-lt, ca-en, ca-es, cs-en, da-en, de-en, de-es, de-fr, de-ja, de-ko, de-ru, de-zh, el-en, en-af, en-ar, en-be, en-bg, en-ca, en-cs, en-da, en-de, en-el, en-es, en-et, en-fa, en-fi, en-fr, en-gl, en-he, en-hi, en-hr, en-hu, en-hy, en-id, en-it, en-ja, en-ko, en-lt, en-lv, en-mr, en-nl, en-no, en-pl, en-pt, en-ro, en-ru, en-sk, en-sl, en-sr, en-sv, en-ta, en-tr, en-uk, en-ur, en-vi, en-zh, es-ca, es-de, es-en, es-fr, es-ja, es-ko, es-zh, et-en, fa-af, fa-en, fi-en, fr-de, fr-en, fr-es, fr-ja, fr-ko, fr-mr, fr-ru, fr-zh, ga-en, gl-en, he-en, he-sv, hi-ar, hi-en, hr-en, hr-lv, hu-en, hy-en, hy-vi, id-en, it-en, ja-de, ja-en, ja-es, ja-fr, ja-ko, ja-zh, ko-de, ko-en, ko-es, ko-fr, ko-ja, ko-zh, lt-bg, lt-en, lv-en, lv-hr, mr-en, nl-en, no-en, pl-en, pl-mr, pl-sk, pt-en, pt-sr, ro-en, ru-de, ru-en, ru-es, ru-fr, sk-en, sk-pl, sl-en, sr-en, sr-pt, sv-en, sv-he, sw-en, ta-en, th-en, tr-en, uk-en, ur-en, vi-en, vi-hy, wo-en, zh-de, zh-en, zh-es, zh-fr, zh-ja, zh-ko
## Dataset Structure
### Data Instances
Each data instance contains the following features: _source_, _good-translation_, _incorrect-translation_, _reference_, _phenomena_, _langpair_
See the [ACES corpus viewer](https://huggingface.co/datasets/nikitam/ACES/viewer/nikitam--ACES/train) to explore more examples.
An example from the ACES challenge set looks like the following:
```
{'source': "Proper nutritional practices alone cannot generate elite performances, but they can significantly affect athletes' overall wellness.", 'good-translation': 'Las prácticas nutricionales adecuadas por sí solas no pueden generar rendimiento de élite, pero pueden afectar significativamente el bienestar general de los atletas.', 'incorrect-translation': 'Las prácticas nutricionales adecuadas por sí solas no pueden generar rendimiento de élite, pero pueden afectar significativamente el bienestar general de los jóvenes atletas.', 'reference': 'No es posible que las prácticas nutricionales adecuadas, por sí solas, generen un rendimiento de elite, pero puede influir en gran medida el bienestar general de los atletas .', 'phenomena': 'addition', 'langpair': 'en-es'}
```
An example from the Span-ACES challenge set looks like the following:
```
{'source': "Proper nutritional practices alone cannot generate elite performances, but they can significantly affect athletes' overall wellness.", 'good-translation': 'Las prácticas nutricionales adecuadas por sí solas no pueden generar rendimiento de élite, pero pueden afectar significativamente el bienestar general de los atletas.', 'incorrect-translation': 'Las prácticas nutricionales adecuadas por sí solas no pueden generar rendimiento de élite, pero pueden afectar significativamente el bienestar general de los jóvenes atletas.', 'reference': 'No es posible que las prácticas nutricionales adecuadas, por sí solas, generen un rendimiento de elite, pero puede influir en gran medida el bienestar general de los atletas .', 'phenomena': 'addition', 'langpair': 'en-es', "incorrect-translation-annotated":"Las prácticas nutricionales adecuadas por sí solas no pueden generar rendimiento de élite, pero pueden afectar significativamente el bienestar general de los <v>jóvenes</v> atletas.","annotation-method":"annotate_word"}
```
### Data Fields
- 'source': a string containing the text that needs to be translated
- 'good-translation': possible translation of the source sentence
- 'incorrect-translation': translation of the source sentence that contains an error or phenomenon of interest
- 'reference': the gold standard translation
- 'phenomena': the type of error or phenomena being studied in the example
- 'langpair': the source language and the target language pair of the example
- 'incorrect-translation-annotated': the incorrect translation with the error span explicitly marked using <v>...</v> tags (see the parsing sketch below)
- 'annotation-method': a field describing how the span annotation was produced
Note that the _good-translation_ may not be free of errors, but it is a better translation than the _incorrect-translation_.
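A minimal sketch of pulling the marked error span out of the _incorrect-translation-annotated_ field; the helper `extract_error_span` is our own illustration (assuming the <v>...</v> format shown above) and not part of any released ACES tooling:
```python
import re

def extract_error_span(annotated):
    """Return the text marked as erroneous between <v> and </v>, or None if no span is present."""
    match = re.search(r"<v>(.*?)</v>", annotated, flags=re.DOTALL)
    return match.group(1) if match else None

annotated = ("Las prácticas nutricionales adecuadas por sí solas no pueden generar rendimiento de élite, "
             "pero pueden afectar significativamente el bienestar general de los <v>jóvenes</v> atletas.")
print(extract_error_span(annotated))  # -> jóvenes
```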
### Data Splits
The ACES dataset has a single split, _train_, which contains the challenge set of 36,476 examples.
Note that the examples in Span-ACES are identical to those in ACES, with two additional columns; they are stored under a separate _train_ split.
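Both configurations can be loaded with the `datasets` library; a minimal sketch using the config names from the metadata above (behaviour may vary slightly across `datasets` versions):
```python
from datasets import load_dataset

# Challenge set: source, good/incorrect translations, reference, phenomena, langpair
aces = load_dataset("nikitam/ACES", "ACES", split="train")

# Same examples plus the annotated error spans and annotation method
span_aces = load_dataset("nikitam/ACES", "Span-ACES", split="train")

print(aces[0]["phenomena"], aces[0]["langpair"])
```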
## Dataset Creation
### Curation Rationale
With the advent of neural networks and especially Transformer-based architectures, machine translation outputs have become more and more fluent. Fluency errors are also judged less severely than accuracy errors by human evaluators (Freitag et al., 2021), which reflects the fact that accuracy errors can have dangerous consequences in certain contexts, for example in the medical and legal domains. For these reasons, we decided to build a challenge set focused on accuracy errors.
Another aspect we focus on is including a broad range of language pairs in ACES. Whenever possible we create examples for all language pairs covered in a source dataset when we use automatic approaches. For phenomena where we create examples manually, we also aim to cover at least two language pairs per phenomenon but are of course limited to the languages spoken by the authors.
We aim to offer a collection of challenge sets covering both easy and hard phenomena. While it may be of interest to the community to continuously test on harder examples to check where machine translation evaluation metrics still break, we believe that easy challenge sets are just as important to ensure that metrics do not suddenly become worse at identifying error types that were previously considered "solved". Therefore, we take a holistic view when creating ACES and do not filter out individual examples or exclude challenge sets based on baseline metric performance or other factors.
### Source Data
#### Initial Data Collection and Normalization
Please see Sections 4 and 5 of the paper.
#### Who are the source language producers?
The dataset contains sentences from the FLORES-101, FLORES-200, PAWS-X, XNLI, XTREME, WinoMT, Wino-X, MuCOW, EuroParl ConDisco, and ParcorFull datasets. Please refer to the respective papers for further details.
### Personal and Sensitive Information
The external datasets may contain sensitive information. Refer to the respective datasets for further details.
## Considerations for Using the Data
### Usage
ACES has been primarily designed to evaluate machine translation metrics on accuracy errors. We expect a metric to score the _good-translation_ consistently higher than the _incorrect-translation_. We report metric performance using a Kendall's tau-like correlation, which compares the number of times a metric scores the good translation above the incorrect translation (concordant) with the number of times it scores it equal to or lower than the incorrect translation (discordant).
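A minimal sketch of this statistic, assuming the usual formulation tau = (concordant - discordant) / (concordant + discordant); the `score` function is a hypothetical stand-in for any segment-level metric:
```python
def kendall_tau_like(examples, score):
    """Kendall's tau-like correlation over ACES-style examples.

    `score(source, translation, reference)` is any segment-level metric; an example
    is concordant when the good translation is scored strictly higher than the
    incorrect translation, and discordant otherwise.
    """
    concordant = discordant = 0
    for ex in examples:
        good = score(ex["source"], ex["good-translation"], ex["reference"])
        bad = score(ex["source"], ex["incorrect-translation"], ex["reference"])
        if good > bad:
            concordant += 1
        else:
            discordant += 1
    return (concordant - discordant) / (concordant + discordant)
```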
### Discussion of Biases
Some examples within the challenge set exhibit biases; however, this is necessary in order to expose the limitations of existing metrics.
### Other Known Limitations
The ACES challenge set exhibits a number of biases. Firstly, there is greater coverage in terms of phenomena and the number of examples for the en-de and en-fr language pairs. This is in part due to the manual effort required to construct examples for some phenomena, in particular, those belonging to the discourse-level and real-world knowledge categories. Further, our choice of language pairs is also limited to the ones available in XLM-R. Secondly, ACES contains more examples for those phenomena for which examples could be generated automatically, compared to those that required manual construction/filtering. Thirdly, some of the automatically generated examples require external libraries which are only available for a few languages (e.g. Multilingual Wordnet). Fourthly, the focus of the challenge set is on accuracy errors. We leave the development of challenge sets for fluency errors to future work.
As a result of using existing datasets as the basis for many of the examples, errors present in these datasets may be propagated through into ACES. Whilst we acknowledge that this is undesirable, in our methods for constructing the incorrect translation we aim to ensure that the quality of the incorrect translation is always worse than the corresponding good translation.
The results and analyses presented in the paper exclude those metrics submitted to the WMT 2022 metrics shared task that provide only system-level outputs. We focus on metrics that provide segment-level outputs, as this enables us to provide a broad overview of metric performance on different phenomenon categories and to conduct fine-grained analyses of performance on individual phenomena. For some of the fine-grained analyses, we apply additional constraints based on the language pairs covered by the metrics, or whether the metrics take the source as input, to address specific questions of interest. As a result of applying some of these additional constraints, our investigations tend to focus more on high- and medium-resource languages than on low-resource languages. We hope to address this shortcoming in future work.
## Additional Information
### Licensing Information
The ACES dataset is licensed under Creative Commons Attribution Non-Commercial Share Alike 4.0 (cc-by-nc-sa-4.0).
### Citation Information
```
@inproceedings{amrhein-etal-2022-aces,
title = "{ACES}: Translation Accuracy Challenge Sets for Evaluating Machine Translation Metrics",
author = "Amrhein, Chantal and
Moghe, Nikita and
Guillou, Liane",
booktitle = "Proceedings of the Seventh Conference on Machine Translation (WMT)",
month = dec,
year = "2022",
address = "Abu Dhabi, United Arab Emirates (Hybrid)",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.wmt-1.44",
pages = "479--513",
}
```
If using Span-ACES,
```
@misc{moghe2024machine,
title={Machine Translation Meta Evaluation through Translation Accuracy Challenge Sets},
author={Nikita Moghe and Arnisa Fazla and Chantal Amrhein and Tom Kocmi and Mark Steedman and Alexandra Birch and Rico Sennrich and Liane Guillou},
year={2024},
eprint={2401.16313},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contact
[Chantal Amrhein](mailto:amrhein@cl.uzh.ch) and [Nikita Moghe](mailto:nikita.moghe@ed.ac.uk) and [Liane Guillou](mailto:lguillou@ed.ac.uk)
Dataset card based on [Allociné](https://huggingface.co/datasets/allocine) |
tollefj/sts14-sts-NOB | ---
license: cc-by-4.0
---
# STS dataset translated to Norwegian Bokmål
Machine translated using the *No Language Left Behind* (NLLB) model series, specifically the distilled 1.3B variant: https://huggingface.co/facebook/nllb-200-distilled-1.3B
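A minimal sketch of the kind of translation call involved, assuming the standard `transformers` API for NLLB models; the exact pipeline, batching, and generation settings used to build this dataset are not documented here:
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "facebook/nllb-200-distilled-1.3B"
tokenizer = AutoTokenizer.from_pretrained(model_name, src_lang="eng_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

inputs = tokenizer("A man is playing a guitar.", return_tensors="pt")
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("nob_Latn"),  # Norwegian Bokmål target
    max_length=128,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```
|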
HydraLM/partitioned_v2_standardized_08 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
splits:
- name: train
num_bytes: 63484239.644193426
num_examples: 124197
download_size: 10735759
dataset_size: 63484239.644193426
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v2_standardized_08"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nam194/codesum_java_512_128_function | ---
dataset_info:
features:
- name: hexsha
dtype: string
- name: repo
dtype: string
- name: path
dtype: string
- name: license
sequence: string
- name: language
dtype: string
- name: identifier
dtype: string
- name: return_type
dtype: string
- name: original_string
dtype: string
- name: original_docstring
dtype: string
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: code
dtype: string
- name: code_tokens
sequence: string
- name: short_docstring
dtype: string
- name: short_docstring_tokens
sequence: string
- name: comment
sequence: string
- name: parameters
list:
- name: param
dtype: string
- name: type
dtype: string
- name: docstring_params
struct:
- name: returns
list:
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: type
dtype: string
- name: raises
list:
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: type
dtype: string
- name: params
list:
- name: identifier
dtype: string
- name: type
dtype: string
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: default
dtype: string
- name: is_optional
dtype: bool
- name: outlier_params
list:
- name: identifier
dtype: string
- name: type
dtype: string
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: default
dtype: string
- name: is_optional
dtype: bool
- name: others
list:
- name: identifier
dtype: string
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 49797821738
num_examples: 6629193
download_size: 9683784722
dataset_size: 49797821738
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Margaret-mmh/mini-MedQuad | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
splits:
- name: train
num_bytes: 4089845
num_examples: 1000
download_size: 1830455
dataset_size: 4089845
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T01:55:24.069120](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o/blob/main/results_2023-10-25T01-55-24.069120.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3683934563758389,\n\
\ \"em_stderr\": 0.004939908621291744,\n \"f1\": 0.40490771812080534,\n\
\ \"f1_stderr\": 0.004849475843152754,\n \"acc\": 0.45406011226844856,\n\
\ \"acc_stderr\": 0.010767859275955907\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3683934563758389,\n \"em_stderr\": 0.004939908621291744,\n\
\ \"f1\": 0.40490771812080534,\n \"f1_stderr\": 0.004849475843152754\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1425322213798332,\n \
\ \"acc_stderr\": 0.009629588445673827\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237986\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T01_55_24.069120
path:
- '**/details_harness|drop|3_2023-10-25T01-55-24.069120.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T01-55-24.069120.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T01_55_24.069120
path:
- '**/details_harness|gsm8k|5_2023-10-25T01-55-24.069120.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T01-55-24.069120.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T01_55_24.069120
path:
- '**/details_harness|winogrande|5_2023-10-25T01-55-24.069120.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T01-55-24.069120.parquet'
- config_name: results
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- results_2023-10-01T14-32-44.417888.parquet
- split: 2023_10_25T01_55_24.069120
path:
- results_2023-10-25T01-55-24.069120.parquet
- split: latest
path:
- results_2023-10-25T01-55-24.069120.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o",
"harness_winogrande_5",
split="train")
```
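To read the aggregated scores rather than per-task details, the "results" configuration can be loaded the same way; a sketch based on the config list above, where split names follow the run timestamps and "latest" points to the most recent run:
```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o",
    "results",
    split="latest",
)
```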
## Latest results
These are the [latest results from run 2023-10-25T01:55:24.069120](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o/blob/main/results_2023-10-25T01-55-24.069120.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3683934563758389,
"em_stderr": 0.004939908621291744,
"f1": 0.40490771812080534,
"f1_stderr": 0.004849475843152754,
"acc": 0.45406011226844856,
"acc_stderr": 0.010767859275955907
},
"harness|drop|3": {
"em": 0.3683934563758389,
"em_stderr": 0.004939908621291744,
"f1": 0.40490771812080534,
"f1_stderr": 0.004849475843152754
},
"harness|gsm8k|5": {
"acc": 0.1425322213798332,
"acc_stderr": 0.009629588445673827
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237986
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter3 | ---
pretty_name: Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3](https://huggingface.co/UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-20T04:40:32.614362](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter3/blob/main/results_2024-01-20T04-40-32.614362.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6144035496773548,\n\
\ \"acc_stderr\": 0.032858739117399755,\n \"acc_norm\": 0.6200519616024565,\n\
\ \"acc_norm_stderr\": 0.03352475225298005,\n \"mc1\": 0.4222766217870257,\n\
\ \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.5789464689775264,\n\
\ \"mc2_stderr\": 0.015807009741465705\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6305460750853242,\n \"acc_stderr\": 0.014104578366491887,\n\
\ \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.01383056892797433\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.676458872734515,\n\
\ \"acc_stderr\": 0.0046687106891924,\n \"acc_norm\": 0.8584943238398726,\n\
\ \"acc_norm_stderr\": 0.0034783009945146973\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n\
\ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n\
\ \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n\
\ \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"\
acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7322580645161291,\n \"acc_stderr\": 0.02518900666021238,\n \"\
acc_norm\": 0.7322580645161291,\n \"acc_norm_stderr\": 0.02518900666021238\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709447,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709447\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n\
\ \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501601,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501601\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808514,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808514\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489284,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489284\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917212,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917212\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n\
\ \"acc_stderr\": 0.016115235504865464,\n \"acc_norm\": 0.3664804469273743,\n\
\ \"acc_norm_stderr\": 0.016115235504865464\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.02651759772446501,\n\
\ \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.02651759772446501\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303055,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303055\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n\
\ \"acc_stderr\": 0.012695244711379778,\n \"acc_norm\": 0.44589308996088656,\n\
\ \"acc_norm_stderr\": 0.012695244711379778\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.619281045751634,\n \"acc_stderr\": 0.0196438015579248,\n \
\ \"acc_norm\": 0.619281045751634,\n \"acc_norm_stderr\": 0.0196438015579248\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n\
\ \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4222766217870257,\n\
\ \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.5789464689775264,\n\
\ \"mc2_stderr\": 0.015807009741465705\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183525\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3419257012888552,\n \
\ \"acc_stderr\": 0.0130660896251828\n }\n}\n```"
repo_url: https://huggingface.co/UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|arc:challenge|25_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|gsm8k|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hellaswag|10_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T04-40-32.614362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T04-40-32.614362.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- '**/details_harness|winogrande|5_2024-01-20T04-40-32.614362.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-20T04-40-32.614362.parquet'
- config_name: results
data_files:
- split: 2024_01_20T04_40_32.614362
path:
- results_2024-01-20T04-40-32.614362.parquet
- split: latest
path:
- results_2024-01-20T04-40-32.614362.parquet
---
# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3](https://huggingface.co/UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter3",
"harness_winogrande_5",
split="train")
```
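The same call pattern also works for the aggregated metrics. As a hedged sketch (this snippet is not part of the auto-generated card), the "results" configuration and the "latest" split declared in the YAML header above can be loaded like this:
```python
from datasets import load_dataset

# Sketch: "results" and "latest" are the config and split names declared in the
# configs section of this card's YAML header; they hold the aggregated metrics.
results = load_dataset(
    "open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter3",
    "results",
    split="latest",
)
```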
## Latest results
These are the [latest results from run 2024-01-20T04:40:32.614362](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter3/blob/main/results_2024-01-20T04-40-32.614362.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6144035496773548,
"acc_stderr": 0.032858739117399755,
"acc_norm": 0.6200519616024565,
"acc_norm_stderr": 0.03352475225298005,
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.5789464689775264,
"mc2_stderr": 0.015807009741465705
},
"harness|arc:challenge|25": {
"acc": 0.6305460750853242,
"acc_stderr": 0.014104578366491887,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.01383056892797433
},
"harness|hellaswag|10": {
"acc": 0.676458872734515,
"acc_stderr": 0.0046687106891924,
"acc_norm": 0.8584943238398726,
"acc_norm_stderr": 0.0034783009945146973
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.02518900666021238,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.02518900666021238
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709447,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709447
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501601,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501601
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808514,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808514
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489284,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489284
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917212,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917212
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.016115235504865464,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.016115235504865464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379778,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379778
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.619281045751634,
"acc_stderr": 0.0196438015579248,
"acc_norm": 0.619281045751634,
"acc_norm_stderr": 0.0196438015579248
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.5789464689775264,
"mc2_stderr": 0.015807009741465705
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183525
},
"harness|gsm8k|5": {
"acc": 0.3419257012888552,
"acc_stderr": 0.0130660896251828
}
}
```
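As a hedged illustration (not part of the card), assuming the dictionary shown above has been parsed into a Python variable named `results` (a hypothetical name, e.g. obtained by reading the linked JSON file), the MMLU-style macro-average can be recomputed from the per-subtask hendrycksTest entries:
```python
# Hedged sketch: `results` is assumed to hold the dict printed above.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_accs)} MMLU subtasks, "
      f"macro-average acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```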
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pankajemplay/llama-intent-1K | ---
dataset_info:
features:
- name: User Query
dtype: string
- name: Intent
dtype: string
- name: id type
dtype: string
- name: id value
dtype: string
- name: id slot filled
dtype: bool
- name: Task
dtype: string
- name: task slot filled
dtype: bool
- name: Bot Response
dtype: string
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 633182
num_examples: 1308
download_size: 189305
dataset_size: 633182
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama-intent-1K"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
suyuanliu/Winnf0VOT | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 100267569.0
num_examples: 1800
download_size: 93892942
dataset_size: 100267569.0
---
# Dataset Card for "Winnf0VOT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CristianaLazar/librispeech5k_train | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train.360
num_bytes: 6796635145.0
num_examples: 5000
download_size: 3988908181
dataset_size: 6796635145.0
---
# Dataset Card for "librispeech5k_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_id_train_10_eval_10 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 237881
num_examples: 150
- name: validation
num_bytes: 59860
num_examples: 48
download_size: 72567
dataset_size: 297741
---
# Dataset Card for "squad_id_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jarod0411/zinc10M_linker_v2 | ---
dataset_info:
features:
- name: smiles
dtype: string
- name: p1
dtype: string
- name: p2
dtype: string
splits:
- name: train
num_bytes: 2481183347.0
num_examples: 24088962
- name: validation
num_bytes: 275347726.0
num_examples: 2673196
download_size: 807049756
dataset_size: 2756531073.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
LinQingYang/my_dataset | ---
license: mit
---
|
RafaelBlue/Ranozera3 | ---
license: openrail
---
|
Multimodal-Fatima/Hatefulmemes_test_facebook_opt_6.7b_Hatefulmemes_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 362719567.0
num_examples: 1000
- name: fewshot_3_bs_16
num_bytes: 363587206.0
num_examples: 1000
- name: fewshot_5_bs_16
num_bytes: 364454992.0
num_examples: 1000
- name: fewshot_8_bs_16
num_bytes: 365760377.0
num_examples: 1000
- name: fewshot_10_bs_16
num_bytes: 366632224.0
num_examples: 1000
download_size: 1814428039
dataset_size: 1823154366.0
---
# Dataset Card for "Hatefulmemes_test_facebook_opt_6.7b_Hatefulmemes_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_luffycodes__mcq-vicuna-13b-v1.5 | ---
pretty_name: Evaluation run of luffycodes/mcq-vicuna-13b-v1.5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [luffycodes/mcq-vicuna-13b-v1.5](https://huggingface.co/luffycodes/mcq-vicuna-13b-v1.5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_luffycodes__mcq-vicuna-13b-v1.5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T06:51:11.600921](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__mcq-vicuna-13b-v1.5/blob/main/results_2023-10-15T06-51-11.600921.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.28366191275167785,\n\
\ \"em_stderr\": 0.004616354866148242,\n \"f1\": 0.34618708053691377,\n\
\ \"f1_stderr\": 0.004545404408691654,\n \"acc\": 0.40521747299651206,\n\
\ \"acc_stderr\": 0.009982345972620842\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.28366191275167785,\n \"em_stderr\": 0.004616354866148242,\n\
\ \"f1\": 0.34618708053691377,\n \"f1_stderr\": 0.004545404408691654\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0803639120545868,\n \
\ \"acc_stderr\": 0.007488258573239077\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7300710339384373,\n \"acc_stderr\": 0.012476433372002604\n\
\ }\n}\n```"
repo_url: https://huggingface.co/luffycodes/mcq-vicuna-13b-v1.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|arc:challenge|25_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|arc:challenge|25_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T04_41_01.190569
path:
- '**/details_harness|drop|3_2023-10-13T04-41-01.190569.parquet'
- split: 2023_10_15T06_51_11.600921
path:
- '**/details_harness|drop|3_2023-10-15T06-51-11.600921.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T06-51-11.600921.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T04_41_01.190569
path:
- '**/details_harness|gsm8k|5_2023-10-13T04-41-01.190569.parquet'
- split: 2023_10_15T06_51_11.600921
path:
- '**/details_harness|gsm8k|5_2023-10-15T06-51-11.600921.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T06-51-11.600921.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hellaswag|10_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hellaswag|10_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T05:01:33.006362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T06:07:11.964362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T05:01:33.006362.parquet'
- split: 2023_09_01T06_07_11.964362
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T06:07:11.964362.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T06:07:11.964362.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T04_41_01.190569
path:
- '**/details_harness|winogrande|5_2023-10-13T04-41-01.190569.parquet'
- split: 2023_10_15T06_51_11.600921
path:
- '**/details_harness|winogrande|5_2023-10-15T06-51-11.600921.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T06-51-11.600921.parquet'
- config_name: results
data_files:
- split: 2023_09_01T05_01_33.006362
path:
- results_2023-09-01T05:01:33.006362.parquet
- split: 2023_09_01T06_07_11.964362
path:
- results_2023-09-01T06:07:11.964362.parquet
- split: 2023_10_13T04_41_01.190569
path:
- results_2023-10-13T04-41-01.190569.parquet
- split: 2023_10_15T06_51_11.600921
path:
- results_2023-10-15T06-51-11.600921.parquet
- split: latest
path:
- results_2023-10-15T06-51-11.600921.parquet
---
# Dataset Card for Evaluation run of luffycodes/mcq-vicuna-13b-v1.5
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/luffycodes/mcq-vicuna-13b-v1.5
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [luffycodes/mcq-vicuna-13b-v1.5](https://huggingface.co/luffycodes/mcq-vicuna-13b-v1.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_luffycodes__mcq-vicuna-13b-v1.5",
"harness_winogrande_5",
split="train")
```
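Similarly, the aggregated results stored in the `results` configuration can be loaded the same way. The snippet below is a minimal sketch based on the configuration list above, using the `latest` split, which points to the most recent run:
```python
from datasets import load_dataset

# Sketch: load the aggregated results; the "latest" split points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_luffycodes__mcq-vicuna-13b-v1.5",
    "results",
    split="latest",
)
print(results[0])
```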
## Latest results
These are the [latest results from run 2023-10-15T06:51:11.600921](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__mcq-vicuna-13b-v1.5/blob/main/results_2023-10-15T06-51-11.600921.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.28366191275167785,
"em_stderr": 0.004616354866148242,
"f1": 0.34618708053691377,
"f1_stderr": 0.004545404408691654,
"acc": 0.40521747299651206,
"acc_stderr": 0.009982345972620842
},
"harness|drop|3": {
"em": 0.28366191275167785,
"em_stderr": 0.004616354866148242,
"f1": 0.34618708053691377,
"f1_stderr": 0.004545404408691654
},
"harness|gsm8k|5": {
"acc": 0.0803639120545868,
"acc_stderr": 0.007488258573239077
},
"harness|winogrande|5": {
"acc": 0.7300710339384373,
"acc_stderr": 0.012476433372002604
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ckmai24/ghibil-style | ---
license: afl-3.0
---
|
AISE-TUDelft/ML4SE23_G6_Original_Prev_Diverse | ---
license: mit
---
|
samhellkill/spacekitty-v1 | ---
license: other
---
|
liuyanchen1015/MULTI_VALUE_qqp_say_complementizer | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 131067
num_examples: 614
- name: test
num_bytes: 1311590
num_examples: 6202
- name: train
num_bytes: 1212547
num_examples: 5505
download_size: 1612005
dataset_size: 2655204
---
# Dataset Card for "MULTI_VALUE_qqp_say_complementizer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
higgsfield/hacker_news_prompt_completion | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 187231365
num_examples: 100000
download_size: 77649586
dataset_size: 187231365
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "hacker_news_prompt_completion"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/19f9e05b | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1332
dataset_size: 186
---
# Dataset Card for "19f9e05b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PVIT/pvit_data_stage2 | ---
license: cc-by-nc-4.0
---
# PVIT dataset
This is the stage 2 pretraining dataset of paper: [Position-Enhanced Visual Instruction Tuning for Multimodal Large Language Models](https://arxiv.org/abs/2308.13437).
## Model description
Position-enhanced Visual Instruction Tuning (PVIT) extends the MLLM by incorporating an additional region-level vision encoder to facilitate support for region-based inputs. Specifically, we adopt the vision encoder from RegionCLIP and use it to extract region-level features by taking images and regions as inputs. Because the region-level features are incorporated as an additional source of information, this design has minimal impact on the original MLLM. Furthermore, since the features provided by RegionCLIP are already aligned to language at a fine-grained level, the overhead of aligning them to the MLLM is relatively small. Following [LLaVA](https://github.com/haotian-liu/LLaVA), we design a two-stage training strategy for PVIT that first pre-trains a linear projection to align the region features to the LLM word embedding, and then performs end-to-end fine-tuning to follow complex fine-grained instructions.
For more details, please refer to our [paper](https://arxiv.org/abs/2308.13437) and [github repo](https://github.com/THUNLP-MT/PVIT).
## How to use
See [here](https://github.com/THUNLP-MT/PVIT#Train) for pretraining instructions.
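If you only need a local copy of the raw data files, they can also be fetched from the Hub directly. Below is a minimal sketch using `huggingface_hub`; the local directory name is only an example:
```python
from huggingface_hub import snapshot_download

# Download all files of this dataset repository into a local folder.
local_path = snapshot_download(
    repo_id="PVIT/pvit_data_stage2",
    repo_type="dataset",
    local_dir="pvit_data_stage2",  # example path, adjust as needed
)
print(f"Dataset files downloaded to: {local_path}")
```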
## Intended use
Primary intended uses: The primary use of PVIT is research on large multimodal models and chatbots.
Primary intended users: The primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence.
## BibTeX entry and citation info
```bibtex
@misc{chen2023positionenhanced,
title={Position-Enhanced Visual Instruction Tuning for Multimodal Large Language Models},
author={Chi Chen and Ruoyu Qin and Fuwen Luo and Xiaoyue Mi and Peng Li and Maosong Sun and Yang Liu},
year={2023},
eprint={2308.13437},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
``` |
jpwahle/autoencoder-paraphrase-dataset | ---
annotations_creators:
- machine-generated
language:
- en
language_creators:
- machine-generated
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Autoencoder Paraphrase Dataset (BERT, RoBERTa, Longformer)
size_categories:
- 100K<n<1M
source_datasets:
- original
tags:
- bert
- roberta
- longformer
- plagiarism
- paraphrase
- academic integrity
- arxiv
- wikipedia
- theses
task_categories:
- text-classification
- text-generation
task_ids: []
paperswithcode_id: are-neural-language-models-good-plagiarists-a
dataset_info:
- split: train
download_size: 2980464
dataset_size: 2980464
- split: test
download_size: 1690032
dataset_size: 1690032
---
# Dataset Card for the Autoencoder Paraphrase Dataset (APC)
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Paper:** https://ieeexplore.ieee.org/document/9651895
- **Total size:** 2.23 GB
- **Train size:** 1.52 GB
- **Test size:** 861 MB
### Dataset Summary
The Autoencoder Paraphrase Corpus (APC) consists of ~200k examples of original text and paraphrases generated using three neural language models.
It uses three models (BERT, RoBERTa, Longformer) on three source texts (Wikipedia, arXiv, student theses).
The examples are aligned, i.e., we sample the same paragraphs for originals and paraphrased versions.
### How to use it
You can load the dataset using the `load_dataset` function:
```python
from datasets import load_dataset
ds = load_dataset("jpwahle/autoencoder-paraphrase-dataset")
print(ds["train"][0])
# OUTPUT:
{
'text': 'War memorial formally unveiled on Whit Monday 16 May 1921 by the Prince of Wales later King Edward VIII with Lutyens in attendance At the unveiling ceremony Captain Fortescue gave a speech during wherein he announced that 11 600 men and women from Devon had been inval while serving in imperialist war He later stated that some 63 700 8 000 regulars 36 700 volunteers 19 000 conscripts had served in the armed forces The heroism of the dead are recorded on a roll of honour of which three copies were made one for Exeter Cathedral one To be held by Tasman county council and another honoring the Prince of Wales placed in a hollow in bedrock base of the war memorial The princes visit generated considerable excitement in the area Thousands of spectators lined the street to greet his motorcade and shops on Market High Street hung out banners with welcoming messages After the unveiling Edward spent ten days touring the local area',
'label': 1,
'dataset': 'wikipedia',
'method': 'longformer'
}
```
### Supported Tasks and Leaderboards
Paraphrase Identification
### Languages
English
## Dataset Structure
### Data Instances
```json
{
'text': 'War memorial formally unveiled on Whit Monday 16 May 1921 by the Prince of Wales later King Edward VIII with Lutyens in attendance At the unveiling ceremony Captain Fortescue gave a speech during wherein he announced that 11 600 men and women from Devon had been inval while serving in imperialist war He later stated that some 63 700 8 000 regulars 36 700 volunteers 19 000 conscripts had served in the armed forces The heroism of the dead are recorded on a roll of honour of which three copies were made one for Exeter Cathedral one To be held by Tasman county council and another honoring the Prince of Wales placed in a hollow in bedrock base of the war memorial The princes visit generated considerable excitement in the area Thousands of spectators lined the street to greet his motorcade and shops on Market High Street hung out banners with welcoming messages After the unveiling Edward spent ten days touring the local area',
'label': 1,
'dataset': 'wikipedia',
'method': 'longformer'
}
```
### Data Fields
| Feature | Description |
| --- | --- |
| `text` | The paragraph text (original or paraphrased). |
| `label` | Whether it is a paraphrase (1) or the original (0). |
| `dataset` | The source dataset (Wikipedia, arXiv, or theses). |
| `method` | The method used (bert, roberta, longformer). |
### Data Splits
- train (Wikipedia x [bert, roberta, longformer])
- test ([Wikipedia, arXiv, theses] x [bert, roberta, longformer])
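As an illustration, a particular source/method combination can be selected from the test split with a standard `filter` call. This is only a sketch; the exact string values of `dataset` and `method` are assumed to be the lower-cased names listed above:
```python
from datasets import load_dataset

test = load_dataset("jpwahle/autoencoder-paraphrase-dataset", split="test")

# Keep only arXiv paragraphs paraphrased with RoBERTa
# (the value spellings "arxiv" and "roberta" are assumptions).
arxiv_roberta = test.filter(
    lambda ex: ex["dataset"] == "arxiv" and ex["method"] == "roberta"
)
print(len(arxiv_roberta), arxiv_roberta[0]["label"])
```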
## Dataset Creation
### Curation Rationale
Providing a resource for testing against autoencoder-paraphrased plagiarism.
### Source Data
#### Initial Data Collection and Normalization
- Paragraphs from `featured articles` from the English Wikipedia dump
- Paragraphs from full-text pdfs of arXMLiv
- Paragraphs from full-text pdfs of Czech student theses (bachelor, master, PhD).
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[Jan Philip Wahle](https://jpwahle.com/)
### Licensing Information
The Autoencoder Paraphrase Dataset is released under CC BY-NC 4.0. By using this corpus, you agree to its usage terms.
### Citation Information
```bibtex
@inproceedings{9651895,
title = {Are Neural Language Models Good Plagiarists? A Benchmark for Neural Paraphrase Detection},
author = {Wahle, Jan Philip and Ruas, Terry and Meuschke, Norman and Gipp, Bela},
year = 2021,
booktitle = {2021 ACM/IEEE Joint Conference on Digital Libraries (JCDL)},
volume = {},
number = {},
pages = {226--229},
doi = {10.1109/JCDL52503.2021.00065}
}
```
### Contributions
Thanks to [@jpwahle](https://github.com/jpwahle) for adding this dataset. |
AdapterOcean/python-code-instructions-18k-alpaca-standardized_cluster_8 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 30601750
num_examples: 3552
download_size: 7874173
dataset_size: 30601750
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python-code-instructions-18k-alpaca-standardized_cluster_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibivibiv/alpaca_lamini14 | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 56180745
num_examples: 129280
download_size: 36259259
dataset_size: 56180745
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BangumiBase/joshikouseinomudazukai | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Joshikousei No Mudazukai
This is the image base of bangumi Joshikousei no Mudazukai; we detected 23 characters and 1598 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 202 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 99 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 11 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 19 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 41 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 74 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 271 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 10 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 22 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 7 | [Download](9/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 10 | 11 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 190 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 33 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 79 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 12 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 110 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 14 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 86 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 147 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 6 | [Download](19/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 20 | 5 | [Download](20/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 21 | 6 | [Download](21/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| noise | 143 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
vlsp-2023-vllm/en-to-vi-formal-informal-tranlations | ---
dataset_info:
features:
- name: en
dtype: string
- name: vi
dtype: string
- name: fewshot_samples
list:
- name: en
dtype: string
- name: vi
dtype: string
splits:
- name: val
num_bytes: 178154
num_examples: 160
- name: test
num_bytes: 175339
num_examples: 160
download_size: 124988
dataset_size: 353493
---
# Few-shot Translation
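The validation and test splits can also be inspected directly with `datasets` before running the harness. This is a minimal sketch, assuming the splits load with the standard loader; the field names follow the metadata above:
```python
from datasets import load_dataset

# Load the validation split of the English-to-Vietnamese translation data.
val = load_dataset("vlsp-2023-vllm/en-to-vi-formal-informal-tranlations", split="val")

example = val[0]
print(example["en"])                    # English source sentence
print(example["vi"])                    # Vietnamese reference translation
print(len(example["fewshot_samples"]))  # accompanying few-shot demonstration pairs
```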
## Install
To install `lm-eval` from the github repository main branch, run:
```bash
git clone https://github.com/hieunguyen1053/lm-evaluation-harness
cd lm-evaluation-harness
pip install -e .
```
## Basic Usage
> **Note**: When reporting results from eval harness, please include the task versions (shown in `results["versions"]`) for reproducibility. This allows bug fixes to tasks while also ensuring that previously reported scores are reproducible. See the [Task Versioning](#task-versioning) section for more info.
### Hugging Face `transformers`
To evaluate a model hosted on the [HuggingFace Hub](https://huggingface.co/models) (e.g. vlsp-2023-vllm/hoa-1b4) on `translation_vi` you can use the following command:
```bash
python main.py \
--model hf-causal \
--model_args pretrained=vlsp-2023-vllm/hoa-1b4 \
--tasks translation_vi \
--batch_size auto \
--device cuda:0
```
Additional arguments can be provided to the model constructor using the `--model_args` flag. Most notably, this supports the common practice of using the `revisions` feature on the Hub to store partially trained checkpoints, or to specify the datatype for running a model:
```bash
python main.py \
--model hf-causal \
--model_args pretrained=vlsp-2023-vllm/hoa-1b4,revision=step100000,dtype="float" \
--tasks translation_vi \
--device cuda:0
```
To evaluate models that are loaded via `AutoSeq2SeqLM` in Huggingface, you instead use `hf-seq2seq`. *To evaluate (causal) models across multiple GPUs, use `--model hf-causal-experimental`*
> **Warning**: Choosing the wrong model may result in erroneous outputs despite not erroring. |
tner/btc | ---
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: BTC
---
# Dataset Card for "tner/btc"
## Dataset Description
- **Repository:** [T-NER](https://github.com/asahi417/tner)
- **Paper:** [https://aclanthology.org/C16-1111/](https://aclanthology.org/C16-1111/)
- **Dataset:** Broad Twitter Corpus
- **Domain:** Twitter
- **Number of Entity:** 3
### Dataset Summary
Broad Twitter Corpus NER dataset formatted as part of the [TNER](https://github.com/asahi417/tner) project.
- Entity Types: `LOC`, `ORG`, `PER`
## Dataset Structure
### Data Instances
An example of `train` looks as follows.
```
{
'tokens': ['I', 'hate', 'the', 'words', 'chunder', ',', 'vomit', 'and', 'puke', '.', 'BUUH', '.'],
'tags': [6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6]
}
```
### Label ID
The label2id dictionary can be found [here](https://huggingface.co/datasets/tner/btc/raw/main/dataset/label.json).
```python
{
"B-LOC": 0,
"B-ORG": 1,
"B-PER": 2,
"I-LOC": 3,
"I-ORG": 4,
"I-PER": 5,
"O": 6
}
```
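As a quick illustration, this mapping can be inverted to turn the numeric `tags` of an instance back into BIO label strings (a self-contained sketch that reuses the dictionary and the example above):
```python
# Invert the label2id mapping shown above to decode numeric tags.
label2id = {
    "B-LOC": 0, "B-ORG": 1, "B-PER": 2,
    "I-LOC": 3, "I-ORG": 4, "I-PER": 5, "O": 6,
}
id2label = {v: k for k, v in label2id.items()}

tokens = ["I", "hate", "the", "words", "chunder", ",", "vomit", "and", "puke", ".", "BUUH", "."]
tags = [6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6]

for token, tag in zip(tokens, tags):
    print(token, id2label[tag])  # every token in this example is outside an entity ("O")
```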
### Data Splits
| name |train|validation|test|
|---------|----:|---------:|---:|
|btc | 6338| 1001|2000|
### Citation Information
```
@inproceedings{derczynski-etal-2016-broad,
title = "Broad {T}witter Corpus: A Diverse Named Entity Recognition Resource",
author = "Derczynski, Leon and
Bontcheva, Kalina and
Roberts, Ian",
booktitle = "Proceedings of {COLING} 2016, the 26th International Conference on Computational Linguistics: Technical Papers",
month = dec,
year = "2016",
address = "Osaka, Japan",
publisher = "The COLING 2016 Organizing Committee",
url = "https://aclanthology.org/C16-1111",
pages = "1169--1179",
abstract = "One of the main obstacles, hampering method development and comparative evaluation of named entity recognition in social media, is the lack of a sizeable, diverse, high quality annotated corpus, analogous to the CoNLL{'}2003 news dataset. For instance, the biggest Ritter tweet corpus is only 45,000 tokens {--} a mere 15{\%} the size of CoNLL{'}2003. Another major shortcoming is the lack of temporal, geographic, and author diversity. This paper introduces the Broad Twitter Corpus (BTC), which is not only significantly bigger, but sampled across different regions, temporal periods, and types of Twitter users. The gold-standard named entity annotations are made by a combination of NLP experts and crowd workers, which enables us to harness crowd recall while maintaining high quality. We also measure the entity drift observed in our dataset (i.e. how entity representation varies over time), and compare to newswire. The corpus is released openly, including source text and intermediate annotations.",
}
``` |
open-llm-leaderboard/details_posicube__Llama2-chat-AYT-13B | ---
pretty_name: Evaluation run of posicube/Llama2-chat-AYT-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [posicube/Llama2-chat-AYT-13B](https://huggingface.co/posicube/Llama2-chat-AYT-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_posicube__Llama2-chat-AYT-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T23:47:31.356201](https://huggingface.co/datasets/open-llm-leaderboard/details_posicube__Llama2-chat-AYT-13B/blob/main/results_2023-10-25T23-47-31.356201.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.02380453020134228,\n\
\ \"em_stderr\": 0.0015611256256327542,\n \"f1\": 0.12621224832214753,\n\
\ \"f1_stderr\": 0.002357573309097525,\n \"acc\": 0.4247779852833908,\n\
\ \"acc_stderr\": 0.009910000290951314\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.02380453020134228,\n \"em_stderr\": 0.0015611256256327542,\n\
\ \"f1\": 0.12621224832214753,\n \"f1_stderr\": 0.002357573309097525\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0887035633055345,\n \
\ \"acc_stderr\": 0.007831458737058714\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.011988541844843915\n\
\ }\n}\n```"
repo_url: https://huggingface.co/posicube/Llama2-chat-AYT-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|arc:challenge|25_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T23_47_31.356201
path:
- '**/details_harness|drop|3_2023-10-25T23-47-31.356201.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T23-47-31.356201.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T23_47_31.356201
path:
- '**/details_harness|gsm8k|5_2023-10-25T23-47-31.356201.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T23-47-31.356201.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hellaswag|10_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T13-56-43.141895.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T13-56-43.141895.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T13-56-43.141895.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T23_47_31.356201
path:
- '**/details_harness|winogrande|5_2023-10-25T23-47-31.356201.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T23-47-31.356201.parquet'
- config_name: results
data_files:
- split: 2023_09_12T13_56_43.141895
path:
- results_2023-09-12T13-56-43.141895.parquet
- split: 2023_10_25T23_47_31.356201
path:
- results_2023-10-25T23-47-31.356201.parquet
- split: latest
path:
- results_2023-10-25T23-47-31.356201.parquet
---
# Dataset Card for Evaluation run of posicube/Llama2-chat-AYT-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/posicube/Llama2-chat-AYT-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [posicube/Llama2-chat-AYT-13B](https://huggingface.co/posicube/Llama2-chat-AYT-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_posicube__Llama2-chat-AYT-13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-25T23:47:31.356201](https://huggingface.co/datasets/open-llm-leaderboard/details_posicube__Llama2-chat-AYT-13B/blob/main/results_2023-10-25T23-47-31.356201.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.02380453020134228,
"em_stderr": 0.0015611256256327542,
"f1": 0.12621224832214753,
"f1_stderr": 0.002357573309097525,
"acc": 0.4247779852833908,
"acc_stderr": 0.009910000290951314
},
"harness|drop|3": {
"em": 0.02380453020134228,
"em_stderr": 0.0015611256256327542,
"f1": 0.12621224832214753,
"f1_stderr": 0.002357573309097525
},
"harness|gsm8k|5": {
"acc": 0.0887035633055345,
"acc_stderr": 0.007831458737058714
},
"harness|winogrande|5": {
"acc": 0.760852407261247,
"acc_stderr": 0.011988541844843915
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Lotem/check | ---
license: bigscience-openrail-m
---
|
Crosstyan/danbooru-public | ---
tags:
- danbooru
---
# Danbooru Public
Danbooru database, including metadata for ["posts"](https://danbooru.donmai.us/wiki_pages/help:posts),
["tags"](https://danbooru.donmai.us/wiki_pages/help:tags) and ["artists"](https://danbooru.donmai.us/artists).
Download from [danbooru public google cloud storage](https://console.cloud.google.com/storage/browser/danbooru_public/data?project=danbooru1).
Updated at 2023/11/30.
Data are encoded with [JSON Lines](https://jsonlines.org/).
```bash
tar -xJf tags.tar.xz
tar -xJf artists.tar.xz
# posts.tar.br is compressed with brotli
# --use-compress-program might also work
# please note that the output is near 20GB
brotli --decompress --stdout posts.tar.br | tar -xf -
```
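After extraction, the files can be streamed record by record as JSON Lines. A short sketch (the filename `posts.json` is only a placeholder; use whatever name the archive actually contains):
```python
import json

# Stream a JSON Lines file one record at a time to avoid loading ~20 GB into memory.
# "posts.json" is a placeholder filename; substitute the file produced by the extraction above.
with open("posts.json", encoding="utf-8") as f:
    for line in f:
        post = json.loads(line)
        print(post.keys())  # inspect the metadata fields of the first record
        break
```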
## See also
- [crosstyan/explore-danbooru](https://github.com/crosstyan/explore-danbooru) |
DElmazi/Student_Performance | ---
license: cc-by-4.0
task_categories:
- feature-extraction
language:
- en
tags:
- linear regression
size_categories:
- n<1K
--- |
ayeshgk/java_bug_fix_ctx_err_small | ---
license: mit
dataset_info:
features:
- name: id
dtype: int64
- name: buggy
dtype: string
- name: fixed
dtype: string
- name: bug_err_ctx
dtype: string
splits:
- name: train
num_bytes: 42495
num_examples: 75
- name: validation
num_bytes: 15439
num_examples: 27
- name: test
num_bytes: 2614
num_examples: 6
download_size: 25983
dataset_size: 60548
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
severo/test_gated_with_extra_fields | ---
extra_gated_prompt: "You agree not to attempt to determine the identity of individuals in this dataset"
extra_gated_fields:
Company: text
Country: text
I agree to use this model for non-commercial use ONLY: checkbox
--- |
AppleHarem/plume_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of plume (Arknights)
This is the dataset of plume (Arknights), containing 131 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
A WebUI containing the crawlers and related tools is available here: [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 131 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 335 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 356 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 131 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 131 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 131 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 335 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 335 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 252 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 356 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 356 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
loubnabnl/coffeescript_checks | ---
dataset_info:
features:
- name: entities
list:
- name: context
dtype: string
- name: end
dtype: int64
- name: score
dtype: float32
- name: start
dtype: int64
- name: tag
dtype: string
- name: value
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_count
dtype: int64
- name: content
dtype: string
- name: id
dtype: string
- name: new_content
dtype: string
- name: modified
dtype: bool
- name: references
dtype: string
splits:
- name: train
num_bytes: 202822078.3919738
num_examples: 23874
download_size: 202150872
dataset_size: 202822078.3919738
---
# Dataset Card for "coffeescript_checks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kardosdrur/europarl-scandinavian | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: da
dtype: string
- name: en
dtype: string
- name: sv
dtype: string
splits:
- name: train
num_bytes: 620348322.4
num_examples: 1304296
- name: test
num_bytes: 155087080.6
num_examples: 326074
download_size: 488376564
dataset_size: 775435403.0
---
# Europarl Scandinavian Languages
The data originates from the Europarl parallel corpus, where English transcriptions of parliamentary discussions were aligned
with a number of other languages algorithmically.
In order to align the Danish and Swedish corpora in the dataset, English entries were hashed with 128-bit MurmurHash3,
and the Danish and Swedish transcriptions were joined on the obtained hash values.
Entries that had more than one match in the other dataset were removed; this ensures that no false positives due to hash collisions
got into the dataset.
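A minimal sketch of this hash-and-join step, assuming pandas DataFrames with `en`/`da` and `en`/`sv` columns and the `mmh3` MurmurHash3 binding (the authoritative implementation is the source code in the repository):
```python
import mmh3  # assumed MurmurHash3 binding; the actual implementation may differ
import pandas as pd

def hash_en(text: str) -> bytes:
    # 128-bit MurmurHash3 of the English transcription
    return mmh3.hash_bytes(text)

# Assumed inputs: English-Danish and English-Swedish aligned frames
en_da = pd.DataFrame({"en": ["Hello world"], "da": ["Hej verden"]})
en_sv = pd.DataFrame({"en": ["Hello world"], "sv": ["Hej världen"]})

for df in (en_da, en_sv):
    df["key"] = df["en"].map(hash_en)

# Drop hash keys that occur more than once on either side, so collisions
# cannot produce false-positive pairs
en_da = en_da[~en_da["key"].duplicated(keep=False)]
en_sv = en_sv[~en_sv["key"].duplicated(keep=False)]

# Join Danish and Swedish transcriptions on the hash of the shared English entry
aligned = en_da.merge(en_sv[["key", "sv"]], on="key")[["da", "en", "sv"]]
print(aligned)
```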
Source code is available in the repository.
The dataset was created to aid the training of sentence transformer models in the Danish Foundation Models project.
|
incodesatx/siddhu | ---
license: artistic-2.0
---
|
open-llm-leaderboard/details_nbeerbower__bruphin-lambda | ---
pretty_name: Evaluation run of nbeerbower/bruphin-lambda
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nbeerbower/bruphin-lambda](https://huggingface.co/nbeerbower/bruphin-lambda)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__bruphin-lambda\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T15:48:06.692625](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__bruphin-lambda/blob/main/results_2024-04-02T15-48-06.692625.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6550555752321207,\n\
\ \"acc_stderr\": 0.03195494564401327,\n \"acc_norm\": 0.6541965490035271,\n\
\ \"acc_norm_stderr\": 0.032626014745260334,\n \"mc1\": 0.5740514075887393,\n\
\ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.7235913354163169,\n\
\ \"mc2_stderr\": 0.014717519704367223\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068744,\n\
\ \"acc_norm\": 0.7235494880546075,\n \"acc_norm_stderr\": 0.013069662474252423\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7133041226847242,\n\
\ \"acc_stderr\": 0.004512940497462743,\n \"acc_norm\": 0.882194781915953,\n\
\ \"acc_norm_stderr\": 0.003217184906847943\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474887,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474887\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n \
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903343,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903343\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43575418994413406,\n\
\ \"acc_stderr\": 0.016583881958602394,\n \"acc_norm\": 0.43575418994413406,\n\
\ \"acc_norm_stderr\": 0.016583881958602394\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47783572359843546,\n\
\ \"acc_stderr\": 0.012757683047716172,\n \"acc_norm\": 0.47783572359843546,\n\
\ \"acc_norm_stderr\": 0.012757683047716172\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5740514075887393,\n\
\ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.7235913354163169,\n\
\ \"mc2_stderr\": 0.014717519704367223\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8445146014206788,\n \"acc_stderr\": 0.010184308214775777\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7073540561031084,\n \
\ \"acc_stderr\": 0.012532334368242887\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/bruphin-lambda
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|arc:challenge|25_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|gsm8k|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hellaswag|10_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T15-48-06.692625.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T15-48-06.692625.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- '**/details_harness|winogrande|5_2024-04-02T15-48-06.692625.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T15-48-06.692625.parquet'
- config_name: results
data_files:
- split: 2024_04_02T15_48_06.692625
path:
- results_2024-04-02T15-48-06.692625.parquet
- split: latest
path:
- results_2024-04-02T15-48-06.692625.parquet
---
# Dataset Card for Evaluation run of nbeerbower/bruphin-lambda
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nbeerbower/bruphin-lambda](https://huggingface.co/nbeerbower/bruphin-lambda) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__bruphin-lambda",
"harness_winogrande_5",
split="train")
```
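The aggregated results mentioned above live in the "results" configuration, whose "latest" split points to the most recent run (see the configuration list in the YAML header); for example:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_nbeerbower__bruphin-lambda",
    "results",
    split="latest",
)
```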
## Latest results
These are the [latest results from run 2024-04-02T15:48:06.692625](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__bruphin-lambda/blob/main/results_2024-04-02T15-48-06.692625.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6550555752321207,
"acc_stderr": 0.03195494564401327,
"acc_norm": 0.6541965490035271,
"acc_norm_stderr": 0.032626014745260334,
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.7235913354163169,
"mc2_stderr": 0.014717519704367223
},
"harness|arc:challenge|25": {
"acc": 0.7013651877133106,
"acc_stderr": 0.013374078615068744,
"acc_norm": 0.7235494880546075,
"acc_norm_stderr": 0.013069662474252423
},
"harness|hellaswag|10": {
"acc": 0.7133041226847242,
"acc_stderr": 0.004512940497462743,
"acc_norm": 0.882194781915953,
"acc_norm_stderr": 0.003217184906847943
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474887,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474887
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.03077805742293167,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.03077805742293167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903343,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903343
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43575418994413406,
"acc_stderr": 0.016583881958602394,
"acc_norm": 0.43575418994413406,
"acc_norm_stderr": 0.016583881958602394
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700856,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700856
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47783572359843546,
"acc_stderr": 0.012757683047716172,
"acc_norm": 0.47783572359843546,
"acc_norm_stderr": 0.012757683047716172
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.7235913354163169,
"mc2_stderr": 0.014717519704367223
},
"harness|winogrande|5": {
"acc": 0.8445146014206788,
"acc_stderr": 0.010184308214775777
},
"harness|gsm8k|5": {
"acc": 0.7073540561031084,
"acc_stderr": 0.012532334368242887
}
}
```
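Each per-task entry above shares the same layout (`acc`, `acc_stderr`, and, for most tasks, `acc_norm`/`acc_norm_stderr`), so aggregate scores can be recomputed directly from the results file. A minimal sketch, assuming the JSON block above has been saved locally as `results.json` (the filename is illustrative):

```python
import json

# Load the results JSON shown above (path is illustrative).
with open("results.json") as f:
    results = json.load(f)

# Average normalized accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = {
    task: scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}
print(f"MMLU subtasks: {len(mmlu)}")
print(f"Mean acc_norm: {sum(mmlu.values()) / len(mmlu):.4f}")
```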
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ConseggioLigure/seed-instruct-eng-lij | ---
license: cc-by-sa-4.0
task_categories:
- conversational
- translation
pretty_name: OLDI Seed eng-lij translation dataset (instruction-style)
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
- name: template_lang
sequence: string
splits:
- name: train
num_bytes: 2347477
num_examples: 5802
- name: dev
num_bytes: 79012
num_examples: 189
- name: test
num_bytes: 86660
num_examples: 202
download_size: 1299002
dataset_size: 2513149
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
This is an English→Ligurian sentence-level translation dataset.
The original data comes from the [OLDI](https://www.oldi.org) [Seed dataset](https://github.com/openlanguagedata/seed), and it has been converted to the instruction format.
The prompts, written in English, ask the model to translate the text to Ligurian. Several prompt variants were randomly sampled, one per sentence; they variously refer to the language as Ligurian or Genoese (the specific dialect of Ligurian used in this dataset):
```
Translate to Ligurian: \<sentence>
Translate to Ligurian (Genoese): \<sentence>
Translate to Genoese: \<sentence>
Translate from English to Ligurian: \<sentence>
Translate from English to Genoese: \<sentence>
Translate from English to Ligurian (Genoese dialect): \<sentence>
Translate this sentence to Ligurian: \<sentence>
Translate this sentence to Genoese: \<sentence>
What’s the Ligurian translation of this sentence? \<sentence>
What’s the Genoese translation of this sentence? \<sentence>
Can you translate this text to Ligurian? \<sentence>
```
The template used for each dataset entry is referenced in the column `template_id`, with ids ranging from 1 to 11 according to the order given above.
The targets always consist of the fixed prefix "The Ligurian (Genoese) translation is:" followed, on a new line, by the Ligurian translation.
The correspondence between `template_id`, prompt template and target template is therefore:
```
[
(1, "Translate to Ligurian:\n", "The Ligurian (Genoese) translation is:\n"),
(2, "Translate to Ligurian (Genoese):\n", "The Ligurian (Genoese) translation is:\n"),
(3, "Translate to Genoese:\n", "The Ligurian (Genoese) translation is:\n"),
(4, "Translate from English to Ligurian:\n", "The Ligurian (Genoese) translation is:\n"),
(5, "Translate from English to Genoese:\n", "The Ligurian (Genoese) translation is:\n"),
(6, "Translate from English to Ligurian (Genoese dialect):\n", "The Ligurian (Genoese) translation is:\n"),
(7, "Translate this sentence to Ligurian:\n", "The Ligurian (Genoese) translation is:\n"),
(8, "Translate this sentence to Genoese:\n", "The Ligurian (Genoese) translation is:\n"),
(9, "What’s the Ligurian translation of this sentence?\n", "The Ligurian (Genoese) translation is:\n"),
(10, "What’s the Genoese translation of this sentence?\n", "The Ligurian (Genoese) translation is:\n"),
(11, "Can you translate this text to Ligurian?\n", "The Ligurian (Genoese) translation is:\n"),
]
```
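As a quick usage sketch (the column names `inputs`, `targets`, and `template_id` come from the dataset metadata above; the printed values are illustrative):

```python
from datasets import load_dataset

# Load the instruction-style eng→lij translation data.
ds = load_dataset("ConseggioLigure/seed-instruct-eng-lij")

example = ds["train"][0]
print(example["inputs"])       # e.g. "Translate to Ligurian:\n<English sentence>"
print(example["targets"])      # "The Ligurian (Genoese) translation is:\n<Ligurian sentence>"
print(example["template_id"])  # integer in 1..11, per the correspondence above
```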
The dataset contains 5,802 train samples, 189 dev samples and 202 test samples. |
AnyaSchen/russian_poetry_with_keywords | ---
dataset_info:
features:
- name: text
dtype: string
- name: author
dtype: string
- name: keywords
dtype: string
splits:
- name: train
num_bytes: 4073925
num_examples: 7755
download_size: 2114437
dataset_size: 4073925
---
# Dataset Card for "russian_poetry_with_keywords"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RinaL/lemmy-world-comments | ---
license: apache-2.0
---
This is a dataset of comments from lemmy.world. |
brandnewx/sd-v1-5 | ---
license: creativeml-openrail-m
---
|
Nexdata/Emotional_Video_Data | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Emotional_Video_Data
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/977?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
1,003 People - Emotional Video Data. The data covers multiple races, indoor scenes, age groups, languages, and emotions (11 types of facial emotions and 15 types of inner emotions). For each sentence in each video, the emotion types (facial and inner), start and end times, and text transcription were annotated. This dataset can be used for tasks such as emotion recognition and sentiment analysis.
For more details, please refer to the link: https://www.nexdata.ai/datasets/977?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification, sentiment-recognition: The dataset can be used to train models for automatic speech recognition (ASR), speaker identification, and emotion/sentiment recognition.
### Languages
English, Chinese
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
myanmar_news | ---
annotations_creators:
- found
language_creators:
- found
language:
- my
license:
- gpl-3.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- topic-classification
pretty_name: MyanmarNews
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype:
class_label:
names:
'0': Sport
'1': Politic
'2': Business
'3': Entertainment
splits:
- name: train
num_bytes: 3797368
num_examples: 8116
download_size: 610592
dataset_size: 3797368
---
# Dataset Card for Myanmar_News
## Dataset Description
- **Repository:** https://github.com/ayehninnkhine/MyanmarNewsClassificationSystem
### Dataset Summary
The Myanmar news dataset contains article snippets in four categories:
Business, Entertainment, Politics, and Sport.
These were collected in October 2017 by Aye Hninn Khine.
### Languages
Myanmar/Burmese language
## Dataset Structure
### Data Fields
- text - text from article
- category - a topic: Business, Entertainment, **Politic**, or **Sport** (note spellings)
### Data Splits
One training set (8,116 total rows)
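A minimal loading sketch (the field names and label set come from the metadata above; depending on your `datasets` version, script-based datasets may additionally require `trust_remote_code=True`):

```python
from datasets import load_dataset

# Load the single training split of the Myanmar news dataset.
ds = load_dataset("myanmar_news", split="train")

# `category` is a ClassLabel: 0=Sport, 1=Politic, 2=Business, 3=Entertainment.
labels = ds.features["category"].names
sample = ds[0]
print(sample["text"][:100])
print(labels[sample["category"]])
```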
### Source Data
#### Initial Data Collection and Normalization
Data was collected by Aye Hninn Khine
and shared on GitHub with a GPL-3.0 license.
Multiple text files were consolidated into one labeled CSV file by Nick Doiron.
## Additional Information
### Dataset Curators
Contributors to original GitHub repo:
- https://github.com/ayehninnkhine
### Licensing Information
GPL-3.0
### Citation Information
See https://github.com/ayehninnkhine/MyanmarNewsClassificationSystem
### Contributions
Thanks to [@mapmeld](https://github.com/mapmeld) for adding this dataset. |
cyanelis/15485 | ---
license: cc-by-nc-4.0
--- |
qbwmwsap/amber-data-arxiv-chunked-360 | ---
license: mit
dataset_info:
features:
- name: token_ids
sequence: int64
- name: source
dtype: string
splits:
- name: train
num_bytes: 419644080
num_examples: 25560
download_size: 81857266
dataset_size: 419644080
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_NLUHOPOE__test-case-0 | ---
pretty_name: Evaluation run of NLUHOPOE/test-case-0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NLUHOPOE/test-case-0](https://huggingface.co/NLUHOPOE/test-case-0) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NLUHOPOE__test-case-0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-16T05:25:06.093843](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__test-case-0/blob/main/results_2024-02-16T05-25-06.093843.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5791278236658676,\n\
\ \"acc_stderr\": 0.033494817808173614,\n \"acc_norm\": 0.5837595891503912,\n\
\ \"acc_norm_stderr\": 0.03419368461778056,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.4880155663864428,\n\
\ \"mc2_stderr\": 0.015371746911854285\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5349829351535836,\n \"acc_stderr\": 0.01457558392201967,\n\
\ \"acc_norm\": 0.5750853242320819,\n \"acc_norm_stderr\": 0.014445698968520769\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5997809201354312,\n\
\ \"acc_stderr\": 0.004889413126208774,\n \"acc_norm\": 0.796355307707628,\n\
\ \"acc_norm_stderr\": 0.004018847286468061\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7129032258064516,\n\
\ \"acc_stderr\": 0.025736542745594528,\n \"acc_norm\": 0.7129032258064516,\n\
\ \"acc_norm_stderr\": 0.025736542745594528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n\
\ \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548047,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548047\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5923076923076923,\n \"acc_stderr\": 0.02491524398598785,\n \
\ \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.02491524398598785\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119994,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119994\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7596330275229358,\n \"acc_stderr\": 0.01832060732096407,\n \"\
acc_norm\": 0.7596330275229358,\n \"acc_norm_stderr\": 0.01832060732096407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n\
\ \"acc_stderr\": 0.028626547912437378,\n \"acc_norm\": 0.7892156862745098,\n\
\ \"acc_norm_stderr\": 0.028626547912437378\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n\
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.024161618127987745,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.024161618127987745\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.768837803320562,\n\
\ \"acc_stderr\": 0.015075523238101074,\n \"acc_norm\": 0.768837803320562,\n\
\ \"acc_norm_stderr\": 0.015075523238101074\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6213872832369942,\n \"acc_stderr\": 0.02611374936131034,\n\
\ \"acc_norm\": 0.6213872832369942,\n \"acc_norm_stderr\": 0.02611374936131034\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2569832402234637,\n\
\ \"acc_stderr\": 0.01461446582196633,\n \"acc_norm\": 0.2569832402234637,\n\
\ \"acc_norm_stderr\": 0.01461446582196633\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.027305308076274695,\n\
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.027305308076274695\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100786,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100786\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.029525914302558555,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.029525914302558555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3924380704041721,\n\
\ \"acc_stderr\": 0.01247124366922911,\n \"acc_norm\": 0.3924380704041721,\n\
\ \"acc_norm_stderr\": 0.01247124366922911\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02989616303312547,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02989616303312547\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5686274509803921,\n \"acc_stderr\": 0.02003639376835263,\n \
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.02003639376835263\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547735,\n\
\ \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547735\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.4880155663864428,\n\
\ \"mc2_stderr\": 0.015371746911854285\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497813\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3434420015163002,\n \
\ \"acc_stderr\": 0.013079933811800311\n }\n}\n```"
repo_url: https://huggingface.co/NLUHOPOE/test-case-0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|arc:challenge|25_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|gsm8k|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hellaswag|10_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T05-25-06.093843.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T05-25-06.093843.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- '**/details_harness|winogrande|5_2024-02-16T05-25-06.093843.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-16T05-25-06.093843.parquet'
- config_name: results
data_files:
- split: 2024_02_16T05_25_06.093843
path:
- results_2024-02-16T05-25-06.093843.parquet
- split: latest
path:
- results_2024-02-16T05-25-06.093843.parquet
---
# Dataset Card for Evaluation run of NLUHOPOE/test-case-0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NLUHOPOE/test-case-0](https://huggingface.co/NLUHOPOE/test-case-0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NLUHOPOE__test-case-0",
"harness_winogrande_5",
split="train")
```
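The aggregated metrics can be loaded in the same way from the "results" configuration; here is a minimal sketch based on the configuration listed in the YAML header above (the "latest" split points to the most recent run):
```python
from datasets import load_dataset

# Aggregated results of the evaluation run for this model
results = load_dataset(
    "open-llm-leaderboard/details_NLUHOPOE__test-case-0",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated metrics record
```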
## Latest results
These are the [latest results from run 2024-02-16T05:25:06.093843](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__test-case-0/blob/main/results_2024-02-16T05-25-06.093843.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5791278236658676,
"acc_stderr": 0.033494817808173614,
"acc_norm": 0.5837595891503912,
"acc_norm_stderr": 0.03419368461778056,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.4880155663864428,
"mc2_stderr": 0.015371746911854285
},
"harness|arc:challenge|25": {
"acc": 0.5349829351535836,
"acc_stderr": 0.01457558392201967,
"acc_norm": 0.5750853242320819,
"acc_norm_stderr": 0.014445698968520769
},
"harness|hellaswag|10": {
"acc": 0.5997809201354312,
"acc_stderr": 0.004889413126208774,
"acc_norm": 0.796355307707628,
"acc_norm_stderr": 0.004018847286468061
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7129032258064516,
"acc_stderr": 0.025736542745594528,
"acc_norm": 0.7129032258064516,
"acc_norm_stderr": 0.025736542745594528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548047,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548047
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.02491524398598785,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.02491524398598785
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119994,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119994
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7596330275229358,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.7596330275229358,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437378,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437378
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.024161618127987745,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.024161618127987745
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.768837803320562,
"acc_stderr": 0.015075523238101074,
"acc_norm": 0.768837803320562,
"acc_norm_stderr": 0.015075523238101074
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6213872832369942,
"acc_stderr": 0.02611374936131034,
"acc_norm": 0.6213872832369942,
"acc_norm_stderr": 0.02611374936131034
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2569832402234637,
"acc_stderr": 0.01461446582196633,
"acc_norm": 0.2569832402234637,
"acc_norm_stderr": 0.01461446582196633
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.027305308076274695,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.027305308076274695
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488544,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488544
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100786,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100786
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.029525914302558555,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.029525914302558555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3924380704041721,
"acc_stderr": 0.01247124366922911,
"acc_norm": 0.3924380704041721,
"acc_norm_stderr": 0.01247124366922911
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02989616303312547,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02989616303312547
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.02003639376835263,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.02003639376835263
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547735,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547735
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.4880155663864428,
"mc2_stderr": 0.015371746911854285
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497813
},
"harness|gsm8k|5": {
"acc": 0.3434420015163002,
"acc_stderr": 0.013079933811800311
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
biadrivex/yong | ---
license: openrail
---
|
JianhaoDYDY/sample | ---
language:
- en
license: apache-2.0
task_categories:
- image-classification
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 2742.0
num_examples: 1
download_size: 20101
dataset_size: 2742.0
---
|
yotam56/hugo_tsne_ds | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': dresses
'1': jackets
'2': man_hoodie
'3': red_tshirts
'4': suits
'5': white_tshirts
'6': women_pants
'7': women_shorts
'8': women_skirts
splits:
- name: train
num_bytes: 358596.0
num_examples: 45
download_size: 367026
dataset_size: 358596.0
---
# Dataset Card for "hugo_tsne_ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OzoneAsai/calculation | ---
license: wtfpl
tag: conversational
task_categories:
- conversational
language:
- en
- zh
- de
- ru
- ko
- fr
- ja
---
# Dataset Card for Calculation
### Size
JSON files: output1.json (≈1.3 GB) through output60.json; roughly 70 to 80 GB in total.
### Dataset Summary
**en**: Calculation. Its range will be expanded later.
**zh**: 计算。其范围将在以后扩展。
**de**: Berechnung. Der Umfang wird später erweitert werden.
**ru**: Расчет. Его диапазон будет расширен позже.
**ko**: 계산. 범위는 나중에 확장될 것입니다.
**fr**: Calcul. Sa portée sera étendue ultérieurement.
**ja**: 計算。範囲は後で拡張されます。
### Supported Tasks and Leaderboards
**en**: conversation, instruction
**zh**: 会话,指令
**de**: Unterhaltung, Anweisung
**ru**: разговор, инструкция
**ko**: 대화, 지시사항
**fr**: conversation, instruction
**ja**: 会話、指示
### Languages
**en**: The dataset uses only numbers and symbols, so it can be used with any language.
**zh**: 该数据集只使用数字和符号。因此任何语言都可以使用它。
**de**: Es werden nur Zahlen und Symbole verwendet. Daher kann diese Datenbank von jeder Sprache verwendet werden.
**ru**: В нем используются только цифры и символы. Таким образом, любой язык может использовать его.
**ko**: 숫자와 기호만 사용되었습니다. 그래서 모든 언어에서 사용할 수 있습니다.
**fr**: Il n'utilise que des chiffres et des symboles. Ainsi, n'importe quelle langue peut l'utiliser.
**ja**: 数字と記号のみが使用されています。したがって、どんな言語でも使用できます.
## Dataset Structure
Each example consists of an input and an output.
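Since the data ships as plain JSON files, one way to inspect it is to download a single shard with `huggingface_hub`; this is a minimal sketch, assuming `output1.json` sits at the repository root and fits in memory:
```python
import json

from huggingface_hub import hf_hub_download

# Download one shard (output1.json is roughly 1.3 GB, so this may take a while)
path = hf_hub_download(
    repo_id="OzoneAsai/calculation",
    repo_type="dataset",
    filename="output1.json",  # assumed location at the repository root
)

with open(path, "r", encoding="utf-8") as f:
    data = json.load(f)

# Inspect the first entry without assuming a particular schema
first = data[0] if isinstance(data, list) else next(iter(data.items()))
print(type(data), first)
```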
## Translation
Translated by ChatGPT |
jhuang14/Labeled_Data | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': airplane
'1': bustruck
'2': other
'3': rail
splits:
- name: train
num_bytes: 1652124.1515151516
num_examples: 92
- name: test
num_bytes: 718314.8484848485
num_examples: 40
download_size: 2372957
dataset_size: 2370439.0
---
# Dataset Card for "Labeled_Data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/scarlet_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of scarlet/紅蓮/红莲/홍련 (Nikke: Goddess of Victory)
This is the dataset of scarlet/紅蓮/红莲/홍련 (Nikke: Goddess of Victory), containing 85 images and their tags.
The core tags of this character are `long_hair, breasts, bangs, large_breasts, red_eyes, white_hair, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 85 | 193.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scarlet_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 85 | 83.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scarlet_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 220 | 186.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scarlet_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 85 | 157.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scarlet_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 220 | 310.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scarlet_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/scarlet_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, solo, holding_sword, black_bodysuit, looking_at_viewer, smile |
| 1 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, solo_focus, penis, looking_at_viewer, mosaic_censoring, nipples, open_mouth, sex, sweat, collarbone, completely_nude, cowgirl_position, girl_on_top, hair_between_eyes, hairband, pussy, thighs, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | holding_sword | black_bodysuit | looking_at_viewer | smile | 1boy | blush | hetero | solo_focus | penis | mosaic_censoring | nipples | open_mouth | sex | sweat | collarbone | completely_nude | cowgirl_position | girl_on_top | hair_between_eyes | hairband | pussy | thighs | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------------|:-----------------|:--------------------|:--------|:-------|:--------|:---------|:-------------|:--------|:-------------------|:----------|:-------------|:------|:--------|:-------------|:------------------|:-------------------|:--------------|:--------------------|:-----------|:--------|:---------|:----------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
kortukov/answer-equivalence-dataset | ---
license: apache-2.0
task_categories:
- text-classification
size_categories:
- 1K<n<10K
configs:
- config_name: default
data_files:
- split: train
path: "train.jsonl.zip"
- split: test
path: "ae_test.jsonl.zip"
- split: dev
path: "ae_dev.jsonl.zip"
- split: dev_bidaf
path: "dev_bidaf.jsonl.zip"
- split: dev_xlnet
path: "dev_xlnet.jsonl.zip"
- split: dev_luke
path: "dev_luke.jsonl.zip"
---
# Answer Equivalence Dataset
This dataset is introduced and described in [Tomayto, Tomahto. Beyond Token-level Answer Equivalence for Question Answering Evaluation](http://arxiv.org/abs/2202.07654).
## Source
This is a repost. The original dataset repository [can be found here](https://github.com/google-research-datasets/answer-equivalence-dataset/tree/main).
## Data splits and sizes
| AE Split | # AE Examples | # Ratings |
|-----------|---------------|-----------|
| Train | 9,090 | 9,090 |
| Dev | 2,734 | 4,446 |
| Test | 5,831 | 9,724 |
| Total | 17,655 | 23,260 |
| Split by system | # AE Examples | # Ratings |
|------------------|---------------|-----------|
| BiDAF dev predictions | 5622 | 7522 |
| XLNet dev predictions | 2448 | 7932 |
| Luke dev predictions | 2240 | 4590 |
| Total | 8,565 | 14,170 |
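The splits above can be loaded directly with the `datasets` library; here is a minimal sketch (split names follow the configuration in the YAML header of this card):
```python
from datasets import load_dataset

REPO = "kortukov/answer-equivalence-dataset"

# Human-rated answer-equivalence examples
train = load_dataset(REPO, split="train")
dev = load_dataset(REPO, split="dev")

# System-specific dev predictions (BiDAF, XLNet, LUKE)
dev_bidaf = load_dataset(REPO, split="dev_bidaf")

print(train[0])  # inspect one rated example
```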
## BERT Matching (BEM) model
The BEM model from the paper, finetuned on this dataset, is available on [tfhub](https://tfhub.dev/google/answer_equivalence/bem/1).
This [colab](https://colab.research.google.com/github/google-research-datasets/answer-equivalence-dataset/blob/main/Answer_Equivalence_BEM_example.ipynb) demonstrates how to use it.
## How to cite AE?
```
@article{bulian-etal-2022-tomayto,
author = {Jannis Bulian and
Christian Buck and
Wojciech Gajewski and
Benjamin B{\"o}rschinger and
Tal Schuster},
title = {Tomayto, Tomahto. Beyond Token-level Answer Equivalence
for Question Answering Evaluation},
journal = {CoRR},
volume = {abs/2202.07654},
year = {2022},
ee = {http://arxiv.org/abs/2202.07654},
}
```
## Disclaimer
This is not an official Google product.
## Contact information
For help or issues, please submit [a GitHub issue to this repository](https://github.com/google-research-datasets/answer-equivalence-dataset/tree/main) or contact the authors by email.
|
autoevaluate/autoeval-staging-eval-project-sasha__dog-food-8a6c4abe-13775898 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- sasha/dog-food
eval_info:
task: image_binary_classification
model: sasha/dog-food-swin-tiny-patch4-window7-224
metrics: ['matthews_correlation']
dataset_name: sasha/dog-food
dataset_config: sasha--dog-food
dataset_split: train
col_mapping:
image: image
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Image Classification
* Model: sasha/dog-food-swin-tiny-patch4-window7-224
* Dataset: sasha/dog-food
* Config: sasha--dog-food
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ahmetgunduz](https://huggingface.co/ahmetgunduz) for evaluating this model. |
minh21/COVID-QA-testset-biencoder-data-45_45_10 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: context_chunks
sequence: string
- name: document_id
dtype: int64
- name: id
dtype: int64
- name: context
dtype: string
splits:
- name: train
num_bytes: 16708455
num_examples: 201
download_size: 442083
dataset_size: 16708455
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "COVID-QA-testset-biencoder-data-45_45_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yoonlee/abnormal_cat | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 6845620.0
num_examples: 9
download_size: 6847520
dataset_size: 6845620.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "abnormal_cat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-109000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1049320
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-merge | ---
pretty_name: Evaluation run of abhishekchohan/mistral-7B-forest-merge
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abhishekchohan/mistral-7B-forest-merge](https://huggingface.co/abhishekchohan/mistral-7B-forest-merge)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-merge\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T23:23:15.649063](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-merge/blob/main/results_2024-01-21T23-23-15.649063.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6022067316089463,\n\
\ \"acc_stderr\": 0.032877722301518426,\n \"acc_norm\": 0.6045609403878123,\n\
\ \"acc_norm_stderr\": 0.03353760382711908,\n \"mc1\": 0.41615667074663404,\n\
\ \"mc1_stderr\": 0.017255657502903043,\n \"mc2\": 0.5748469157653282,\n\
\ \"mc2_stderr\": 0.015758784357589765\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6049488054607508,\n \"acc_stderr\": 0.014285898292938167,\n\
\ \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068285\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6519617606054571,\n\
\ \"acc_stderr\": 0.004753746951620151,\n \"acc_norm\": 0.8440549691296555,\n\
\ \"acc_norm_stderr\": 0.0036206175507473956\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n\
\ \"acc_stderr\": 0.02564938106302927,\n \"acc_norm\": 0.7161290322580646,\n\
\ \"acc_norm_stderr\": 0.02564938106302927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.034711928605184676,\n\
\ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.034711928605184676\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117467,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117467\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846486,\n\
\ \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846486\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n\
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7871559633027523,\n \"acc_stderr\": 0.017549376389313694,\n \"\
acc_norm\": 0.7871559633027523,\n \"acc_norm_stderr\": 0.017549376389313694\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098822,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098822\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7816091954022989,\n\
\ \"acc_stderr\": 0.014774358319934499,\n \"acc_norm\": 0.7816091954022989,\n\
\ \"acc_norm_stderr\": 0.014774358319934499\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.02557412378654667,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.02557412378654667\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n\
\ \"acc_stderr\": 0.01611523550486548,\n \"acc_norm\": 0.3664804469273743,\n\
\ \"acc_norm_stderr\": 0.01611523550486548\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.027363593284684972,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.027363593284684972\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153273,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153273\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144366,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144366\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n\
\ \"acc_stderr\": 0.01268590653820624,\n \"acc_norm\": 0.4426336375488918,\n\
\ \"acc_norm_stderr\": 0.01268590653820624\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n\
\ \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6078431372549019,\n \"acc_stderr\": 0.019751726508762637,\n \
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.019751726508762637\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.030021056238440307,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.030021056238440307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41615667074663404,\n\
\ \"mc1_stderr\": 0.017255657502903043,\n \"mc2\": 0.5748469157653282,\n\
\ \"mc2_stderr\": 0.015758784357589765\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712664\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.511751326762699,\n \
\ \"acc_stderr\": 0.013768680408142806\n }\n}\n```"
repo_url: https://huggingface.co/abhishekchohan/mistral-7B-forest-merge
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|arc:challenge|25_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|arc:challenge|25_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|gsm8k|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|gsm8k|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hellaswag|10_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hellaswag|10_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T23-19-15.004437.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T23-23-15.649063.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T23-23-15.649063.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- '**/details_harness|winogrande|5_2024-01-21T23-19-15.004437.parquet'
- split: 2024_01_21T23_23_15.649063
path:
- '**/details_harness|winogrande|5_2024-01-21T23-23-15.649063.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T23-23-15.649063.parquet'
- config_name: results
data_files:
- split: 2024_01_21T23_19_15.004437
path:
- results_2024-01-21T23-19-15.004437.parquet
- split: 2024_01_21T23_23_15.649063
path:
- results_2024-01-21T23-23-15.649063.parquet
- split: latest
path:
- results_2024-01-21T23-23-15.649063.parquet
---
# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-forest-merge
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhishekchohan/mistral-7B-forest-merge](https://huggingface.co/abhishekchohan/mistral-7B-forest-merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-merge",
"harness_winogrande_5",
split="train")
```
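As noted above, each configuration also exposes a "latest" split, and the aggregated metrics live in the "results" configuration. A minimal sketch of loading both (same repository and configuration names as above; nothing here beyond what the card already lists):
```python
from datasets import load_dataset

# "latest" always mirrors the most recent run for a given task configuration.
details_latest = load_dataset(
    "open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-merge",
    "harness_winogrande_5",
    split="latest",
)

# The "results" configuration stores the aggregated metrics used by the leaderboard.
aggregated = load_dataset(
    "open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-merge",
    "results",
    split="latest",
)
```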
## Latest results
These are the [latest results from run 2024-01-21T23:23:15.649063](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-merge/blob/main/results_2024-01-21T23-23-15.649063.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the per-run results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6022067316089463,
"acc_stderr": 0.032877722301518426,
"acc_norm": 0.6045609403878123,
"acc_norm_stderr": 0.03353760382711908,
"mc1": 0.41615667074663404,
"mc1_stderr": 0.017255657502903043,
"mc2": 0.5748469157653282,
"mc2_stderr": 0.015758784357589765
},
"harness|arc:challenge|25": {
"acc": 0.6049488054607508,
"acc_stderr": 0.014285898292938167,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068285
},
"harness|hellaswag|10": {
"acc": 0.6519617606054571,
"acc_stderr": 0.004753746951620151,
"acc_norm": 0.8440549691296555,
"acc_norm_stderr": 0.0036206175507473956
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.02564938106302927,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.02564938106302927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.034711928605184676,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.034711928605184676
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117467,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5512820512820513,
"acc_stderr": 0.025217315184846486,
"acc_norm": 0.5512820512820513,
"acc_norm_stderr": 0.025217315184846486
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7871559633027523,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.7871559633027523,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098822,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098822
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459754,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7816091954022989,
"acc_stderr": 0.014774358319934499,
"acc_norm": 0.7816091954022989,
"acc_norm_stderr": 0.014774358319934499
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.02557412378654667,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.02557412378654667
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.01611523550486548,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.01611523550486548
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.027363593284684972,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.027363593284684972
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153273,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153273
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.02604176620271716,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.02604176620271716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144366,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144366
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.01268590653820624,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.01268590653820624
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.019751726508762637,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.019751726508762637
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.030021056238440307,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.030021056238440307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41615667074663404,
"mc1_stderr": 0.017255657502903043,
"mc2": 0.5748469157653282,
"mc2_stderr": 0.015758784357589765
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.011690933809712664
},
"harness|gsm8k|5": {
"acc": 0.511751326762699,
"acc_stderr": 0.013768680408142806
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CATIE-AQ/xwinograd_fr_prompt_coreference | ---
language:
- fr
license:
- cc-by-4.0
size_categories:
- n<1K
tags:
- coreference
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- xwinograd
---
# xwinograd_fr_prompt_coreference
## Summary
**xwinograd_fr_prompt_coreference** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **830** rows that can be used for a coreference task.
The original data (without prompts) comes from the [xwinograd](https://huggingface.co/datasets/Muennighoff/xwinograd) dataset by Muennighoff, of which only the French part has been kept.
A list of prompts (see below) was then applied to build the input and target columns, so as to obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
10 prompts were created for this dataset. The logic applied consists in proposing prompts in the indicative mood, in the informal second person ("tutoiement") and in the formal second person ("vouvoiement").
```
'"'+sentence+'"\nRemplacer le "_" dans la phrase ci-dessus par la bonne option :\n- "'+option1+'"\n- "'+option2+'"',
'"'+sentence+'"\nRemplace le "_" dans la phrase ci-dessus par la bonne option :\n- "'+option1+'"\n- "'+option2+'"',
'"'+sentence+'"\nRemplacez le "_" dans la phrase ci-dessus par la bonne option :\n- "'+option1+'"\n- "'+option2+'"',
'"'+sentence+'" Dans la phrase précédente, "_" fait-il référence à "'+option1+'" ou "'+option2+'" ?',
'"'+sentence+'" À quoi le "_" dans la phrase ci-dessus fait-il référence ? "'+option1+'" ou "'+option2+'" ?',
'"'+sentence+'" Le "_" dans la phrase ci-dessous fait référence à "'+option1+'"\n- "'+option2+'" ?',
'Remplisser le "_" de la phrase suivante : "'+sentence+ '"\nChoix :\n- "'+option1+'"\n- "'+option2+'"\nRéponse :',
'Remplis le "_" de la phrase suivante : "'+sentence+ '"\nChoix :\n- "'+option1+'"\n- "'+option2+'"\nRéponse :',
'Remplissez le "_" de la phrase suivante : "'+sentence+ '"\nChoix :\n- "'+option1+'"\n- "'+option2+'"\nRéponse :',
'Dans la phrase ci-dessous, le "_" renvoie-t-il à "'+option1+'" ou "'+option2+'" ? : '+sentence,
```
### Features used in the prompts
In the prompt list above, `option1`, `option2`, `sentence` and `targets` have been constructed from:
```
xwinograd = load_dataset('Muennighoff/xwinograd','fr')
sentence = xwinograd['test'][i]['sentence']
option1 = xwinograd['test'][i]['option1']
option2 = xwinograd['test'][i]['option2']
targets = str(xwinograd['test'][i]['answer']).replace("1",xwinograd['test'][i]['option1']).replace("2",xwinograd['test'][i]['option2'])
```
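For illustration, the construction above can be replayed on a single row. The sketch below is not part of the original card; it simply applies the first template of the list to the first test example, using the same fields as the snippet above:
```
from datasets import load_dataset

# Illustrative sketch: apply the first prompt template to one xwinograd row.
xwinograd = load_dataset("Muennighoff/xwinograd", "fr")
row = xwinograd["test"][0]
sentence, option1, option2 = row["sentence"], row["option1"], row["option2"]

inputs = (
    '"' + sentence + '"\nRemplacer le "_" dans la phrase ci-dessus par la bonne option :'
    '\n- "' + option1 + '"\n- "' + option2 + '"'
)
# The target is the text of the correct option, as in the construction above.
targets = str(row["answer"]).replace("1", option1).replace("2", option2)
print(inputs)
print(targets)
```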
# Splits
- `train` with 830 samples
- no `valid` split
- no `test` split
# How to use?
```
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/xwinograd_fr_prompt_coreference")
```
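To check the resulting schema and look at a first prompted example (illustrative only; the column names are not listed on this card, so inspect `column_names` rather than assuming them):
```
from datasets import load_dataset

dataset = load_dataset("CATIE-AQ/xwinograd_fr_prompt_coreference")
# Inspect the schema and the first example of the train split.
print(dataset["train"].column_names)
print(dataset["train"][0])
```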
# Citation
## Original data
> @misc{muennighoff2022crosslingual,
title={Crosslingual Generalization through Multitask Finetuning},
author={Niklas Muennighoff and Thomas Wang and Lintang Sutawika and Adam Roberts and Stella Biderman and Teven Le Scao and M Saiful Bari and Sheng Shen and Zheng-Xin Yong and Hailey Schoelkopf and Xiangru Tang and Dragomir Radev and Alham Fikri Aji and Khalid Almubarak and Samuel Albanie and Zaid Alyafeai and Albert Webson and Edward Raff and Colin Raffel},
year={2022},
eprint={2211.01786},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
> @misc{tikhonov2021heads,
title={It's All in the Heads: Using Attention Heads as a Baseline for Cross-Lingual Transfer in Commonsense Reasoning},
author={Alexey Tikhonov and Max Ryabinin},
year={2021},
eprint={2106.12066},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
## This Dataset
> @misc {centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
author = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
title = { DFP (Revision 1d24c09) },
year = 2023,
url = { https://huggingface.co/datasets/CATIE-AQ/DFP },
doi = { 10.57967/hf/1200 },
publisher = { Hugging Face }
}
## License
[cc-by-4.0](https://creativecommons.org/licenses/by/4.0/deed.en) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_248 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 925956496.0
num_examples: 180428
download_size: 946237901
dataset_size: 925956496.0
---
# Dataset Card for "chunk_248"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_qqp_participle_past_tense | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 126913
num_examples: 609
- name: test
num_bytes: 1380942
num_examples: 6474
- name: train
num_bytes: 1258173
num_examples: 5783
download_size: 1709920
dataset_size: 2766028
---
# Dataset Card for "MULTI_VALUE_qqp_participle_past_tense"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
causal-lm/instinwild | ---
language: en
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 35761516
num_examples: 46971
- name: validation
num_bytes: 4012755
num_examples: 5220
download_size: 22678351
dataset_size: 39774271
---
# Dataset Card for "instinwild"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp | ---
pretty_name: Evaluation run of Weyaxi/Bagel-Hermes-34B-Slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/Bagel-Hermes-34B-Slerp](https://huggingface.co/Weyaxi/Bagel-Hermes-34B-Slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T01:56:18.562449](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp/blob/main/results_2024-01-14T01-56-18.562449.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7687638749469244,\n\
\ \"acc_stderr\": 0.02791668972955577,\n \"acc_norm\": 0.7731851983230489,\n\
\ \"acc_norm_stderr\": 0.028441222412067358,\n \"mc1\": 0.4969400244798042,\n\
\ \"mc1_stderr\": 0.01750317326096062,\n \"mc2\": 0.6709148255495884,\n\
\ \"mc2_stderr\": 0.014645409374455808\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6706484641638225,\n \"acc_stderr\": 0.013734057652635474,\n\
\ \"acc_norm\": 0.7073378839590444,\n \"acc_norm_stderr\": 0.013295916103619422\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6638119896434973,\n\
\ \"acc_stderr\": 0.004714386376337134,\n \"acc_norm\": 0.8568014339772954,\n\
\ \"acc_norm_stderr\": 0.0034955936625207526\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.02629399585547494,\n\
\ \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.02629399585547494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8188679245283019,\n \"acc_stderr\": 0.023702963526757798,\n\
\ \"acc_norm\": 0.8188679245283019,\n \"acc_norm_stderr\": 0.023702963526757798\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9166666666666666,\n\
\ \"acc_stderr\": 0.023112508176051236,\n \"acc_norm\": 0.9166666666666666,\n\
\ \"acc_norm_stderr\": 0.023112508176051236\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\":\
\ 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.049598599663841815,\n\
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.049598599663841815\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n\
\ \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.026355158413349414,\n\
\ \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.026355158413349414\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7724137931034483,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.7724137931034483,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6931216931216931,\n \"acc_stderr\": 0.02375292871211213,\n \"\
acc_norm\": 0.6931216931216931,\n \"acc_norm_stderr\": 0.02375292871211213\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.9096774193548387,\n \"acc_stderr\": 0.016306570644488313,\n \"\
acc_norm\": 0.9096774193548387,\n \"acc_norm_stderr\": 0.016306570644488313\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6502463054187192,\n \"acc_stderr\": 0.03355400904969566,\n \"\
acc_norm\": 0.6502463054187192,\n \"acc_norm_stderr\": 0.03355400904969566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\"\
: 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n\
\ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9292929292929293,\n \"acc_stderr\": 0.018263105420199505,\n \"\
acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.018263105420199505\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n\
\ \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.01889552448260495,\n \
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.01889552448260495\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.42962962962962964,\n \"acc_stderr\": 0.030182099804387262,\n \
\ \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.030182099804387262\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.023274255898707946,\n\
\ \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.023274255898707946\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5364238410596026,\n \"acc_stderr\": 0.04071636065944217,\n \"\
acc_norm\": 0.5364238410596026,\n \"acc_norm_stderr\": 0.04071636065944217\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9192660550458716,\n \"acc_stderr\": 0.011680172292862088,\n \"\
acc_norm\": 0.9192660550458716,\n \"acc_norm_stderr\": 0.011680172292862088\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6527777777777778,\n \"acc_stderr\": 0.032468872436376486,\n \"\
acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.032468872436376486\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"\
acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \
\ \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.02693611191280226,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.02693611191280226\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597446,\n\
\ \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597446\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"\
acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n\
\ \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n\
\ \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.02632138319878367,\n\
\ \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.02632138319878367\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n\
\ \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.6339285714285714,\n\
\ \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331366,\n\
\ \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331366\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n\
\ \"acc_stderr\": 0.015537514263253876,\n \"acc_norm\": 0.9401709401709402,\n\
\ \"acc_norm_stderr\": 0.015537514263253876\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9067688378033205,\n\
\ \"acc_stderr\": 0.010397417087292849,\n \"acc_norm\": 0.9067688378033205,\n\
\ \"acc_norm_stderr\": 0.010397417087292849\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8179190751445087,\n \"acc_stderr\": 0.02077676110251298,\n\
\ \"acc_norm\": 0.8179190751445087,\n \"acc_norm_stderr\": 0.02077676110251298\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.794413407821229,\n\
\ \"acc_stderr\": 0.013516116210724202,\n \"acc_norm\": 0.794413407821229,\n\
\ \"acc_norm_stderr\": 0.013516116210724202\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.021170623011213505,\n\
\ \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.021170623011213505\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8295819935691319,\n\
\ \"acc_stderr\": 0.021355343028264053,\n \"acc_norm\": 0.8295819935691319,\n\
\ \"acc_norm_stderr\": 0.021355343028264053\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8827160493827161,\n \"acc_stderr\": 0.017903112615281123,\n\
\ \"acc_norm\": 0.8827160493827161,\n \"acc_norm_stderr\": 0.017903112615281123\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.648936170212766,\n \"acc_stderr\": 0.028473501272963758,\n \
\ \"acc_norm\": 0.648936170212766,\n \"acc_norm_stderr\": 0.028473501272963758\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6029986962190352,\n\
\ \"acc_stderr\": 0.012496346982909556,\n \"acc_norm\": 0.6029986962190352,\n\
\ \"acc_norm_stderr\": 0.012496346982909556\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8419117647058824,\n \"acc_stderr\": 0.022161462608068522,\n\
\ \"acc_norm\": 0.8419117647058824,\n \"acc_norm_stderr\": 0.022161462608068522\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273344,\n \
\ \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273344\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8448979591836735,\n \"acc_stderr\": 0.0231747988612186,\n\
\ \"acc_norm\": 0.8448979591836735,\n \"acc_norm_stderr\": 0.0231747988612186\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4969400244798042,\n\
\ \"mc1_stderr\": 0.01750317326096062,\n \"mc2\": 0.6709148255495884,\n\
\ \"mc2_stderr\": 0.014645409374455808\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873492\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6626231993934799,\n \
\ \"acc_stderr\": 0.013023665136222096\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/Bagel-Hermes-34B-Slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|arc:challenge|25_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|gsm8k|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hellaswag|10_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-18.562449.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T01-56-18.562449.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- '**/details_harness|winogrande|5_2024-01-14T01-56-18.562449.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T01-56-18.562449.parquet'
- config_name: results
data_files:
- split: 2024_01_14T01_56_18.562449
path:
- results_2024-01-14T01-56-18.562449.parquet
- split: latest
path:
- results_2024-01-14T01-56-18.562449.parquet
---
# Dataset Card for Evaluation run of Weyaxi/Bagel-Hermes-34B-Slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Bagel-Hermes-34B-Slerp](https://huggingface.co/Weyaxi/Bagel-Hermes-34B-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp",
"harness_winogrande_5",
split="train")
```
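As a further illustration (a sketch, not part of the original card), the aggregated metrics can be loaded from the `results` configuration in the same way; the `results` config name and the `latest` split name are taken from the YAML configs listed above:
```python
from datasets import load_dataset

# Load the aggregated results of the latest run (sketch; config/split names from the YAML above)
results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp",
    "results",
    split="latest",
)
print(results[0])  # the row holding the aggregated metrics
```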
## Latest results
These are the [latest results from run 2024-01-14T01:56:18.562449](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp/blob/main/results_2024-01-14T01-56-18.562449.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7687638749469244,
"acc_stderr": 0.02791668972955577,
"acc_norm": 0.7731851983230489,
"acc_norm_stderr": 0.028441222412067358,
"mc1": 0.4969400244798042,
"mc1_stderr": 0.01750317326096062,
"mc2": 0.6709148255495884,
"mc2_stderr": 0.014645409374455808
},
"harness|arc:challenge|25": {
"acc": 0.6706484641638225,
"acc_stderr": 0.013734057652635474,
"acc_norm": 0.7073378839590444,
"acc_norm_stderr": 0.013295916103619422
},
"harness|hellaswag|10": {
"acc": 0.6638119896434973,
"acc_stderr": 0.004714386376337134,
"acc_norm": 0.8568014339772954,
"acc_norm_stderr": 0.0034955936625207526
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.02629399585547494,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.02629399585547494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8188679245283019,
"acc_stderr": 0.023702963526757798,
"acc_norm": 0.8188679245283019,
"acc_norm_stderr": 0.023702963526757798
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.023112508176051236,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.023112508176051236
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.026355158413349414,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.026355158413349414
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7724137931034483,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.7724137931034483,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6931216931216931,
"acc_stderr": 0.02375292871211213,
"acc_norm": 0.6931216931216931,
"acc_norm_stderr": 0.02375292871211213
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9096774193548387,
"acc_stderr": 0.016306570644488313,
"acc_norm": 0.9096774193548387,
"acc_norm_stderr": 0.016306570644488313
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6502463054187192,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.6502463054187192,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.018263105420199505,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.018263105420199505
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.01889552448260495,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.01889552448260495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.030182099804387262,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.030182099804387262
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.023274255898707946,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.023274255898707946
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5364238410596026,
"acc_stderr": 0.04071636065944217,
"acc_norm": 0.5364238410596026,
"acc_norm_stderr": 0.04071636065944217
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9192660550458716,
"acc_stderr": 0.011680172292862088,
"acc_norm": 0.9192660550458716,
"acc_norm_stderr": 0.011680172292862088
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280226,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280226
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597446,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597446
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.027285246312758957,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.027285246312758957
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563275,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.02632138319878367,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.02632138319878367
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.04572372358737431,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.04572372358737431
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331366,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331366
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253876,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253876
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9067688378033205,
"acc_stderr": 0.010397417087292849,
"acc_norm": 0.9067688378033205,
"acc_norm_stderr": 0.010397417087292849
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8179190751445087,
"acc_stderr": 0.02077676110251298,
"acc_norm": 0.8179190751445087,
"acc_norm_stderr": 0.02077676110251298
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.794413407821229,
"acc_stderr": 0.013516116210724202,
"acc_norm": 0.794413407821229,
"acc_norm_stderr": 0.013516116210724202
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8366013071895425,
"acc_stderr": 0.021170623011213505,
"acc_norm": 0.8366013071895425,
"acc_norm_stderr": 0.021170623011213505
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8295819935691319,
"acc_stderr": 0.021355343028264053,
"acc_norm": 0.8295819935691319,
"acc_norm_stderr": 0.021355343028264053
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8827160493827161,
"acc_stderr": 0.017903112615281123,
"acc_norm": 0.8827160493827161,
"acc_norm_stderr": 0.017903112615281123
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.648936170212766,
"acc_stderr": 0.028473501272963758,
"acc_norm": 0.648936170212766,
"acc_norm_stderr": 0.028473501272963758
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6029986962190352,
"acc_stderr": 0.012496346982909556,
"acc_norm": 0.6029986962190352,
"acc_norm_stderr": 0.012496346982909556
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8419117647058824,
"acc_stderr": 0.022161462608068522,
"acc_norm": 0.8419117647058824,
"acc_norm_stderr": 0.022161462608068522
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273344,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273344
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8448979591836735,
"acc_stderr": 0.0231747988612186,
"acc_norm": 0.8448979591836735,
"acc_norm_stderr": 0.0231747988612186
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646613,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646613
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4969400244798042,
"mc1_stderr": 0.01750317326096062,
"mc2": 0.6709148255495884,
"mc2_stderr": 0.014645409374455808
},
"harness|winogrande|5": {
"acc": 0.8437253354380426,
"acc_stderr": 0.010205351791873492
},
"harness|gsm8k|5": {
"acc": 0.6626231993934799,
"acc_stderr": 0.013023665136222096
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ajyy/MELD_audio | ---
dataset_info:
- config_name: MELD_Audio
features:
- name: text
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: emotion
dtype:
class_label:
names:
'0': neutral
'1': joy
'2': sadness
'3': anger
'4': fear
'5': disgust
'6': surprise
- name: sentiment
dtype:
class_label:
names:
'0': neutral
'1': positive
'2': negative
splits:
- name: train
num_bytes: 3629722
num_examples: 9988
- name: validation
num_bytes: 411341
num_examples: 1108
- name: test
num_bytes: 945283
num_examples: 2610
download_size: 7840135137
dataset_size: 4986346
license: gpl-3.0
language:
- en
pretty_name: MELD
size_categories:
- 10K<n<100K
tags:
- speech-emotion-recognition
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
An audio-only version of the Multimodal EmotionLines Dataset (MELD), with emotion and sentiment labels for each utterance.
## Dataset Details
### Dataset Description
The Multimodal EmotionLines Dataset (MELD) was created by enhancing and extending the EmotionLines dataset.
MELD contains the same dialogue instances available in EmotionLines, but it also includes the audio and
visual modalities along with text. MELD has more than 1400 dialogues and 13000 utterances from the Friends TV series.
Multiple speakers participate in the dialogues. Each utterance in a dialogue is labeled with one of seven
emotions -- Anger, Disgust, Sadness, Joy, Neutral, Surprise, and Fear. MELD also has a sentiment (positive,
negative, or neutral) annotation for each utterance.
This dataset is modified from https://huggingface.co/datasets/zrr1999/MELD_Text_Audio.
The audio was extracted from the MELD mp4 files; each clip is single-channel (mono) with a 16 kHz sample rate.
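A minimal loading sketch (assuming the `MELD_Audio` configuration name from the metadata above; decoding the audio additionally requires an audio backend such as `soundfile`):
```python
from datasets import load_dataset

# Load the audio configuration of MELD (config name taken from the metadata above).
meld = load_dataset("ajyy/MELD_audio", "MELD_Audio")

sample = meld["train"][0]
print(sample["text"])                    # utterance transcript
print(sample["audio"]["sampling_rate"])  # 16000 (mono, 16 kHz)

# Map the integer labels back to their class names.
emotion_names = meld["train"].features["emotion"].names      # neutral, joy, sadness, ...
sentiment_names = meld["train"].features["sentiment"].names  # neutral, positive, negative
print(emotion_names[sample["emotion"]], sentiment_names[sample["sentiment"]])
```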
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
davidberenstein1957/ultra-feedback-dutch-cleaned-hq_iter0 | ---
dataset_info:
features:
- name: generated
list:
- name: content
dtype: string
- name: role
dtype: string
- name: real
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 82544374.21505675
num_examples: 19426
- name: test
num_bytes: 9173957.784943247
num_examples: 2159
download_size: 51274646
dataset_size: 91718332.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "ultra-feedback-dutch-cleaned-hq_iter0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_num_v5_full_recite_full_passage_random_permute_rerun_2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 5985907.839669421
num_examples: 3365
- name: validation
num_bytes: 580808
num_examples: 300
download_size: 1607029
dataset_size: 6566715.839669421
---
# Dataset Card for "squad_qa_num_v5_full_recite_full_passage_random_permute_rerun_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ngxingyu/iwslt17_google_trans | ---
license: cc-by-nc-4.0
dataset_info:
features:
- name: en
dtype: string
- name: google_zh
dtype: string
- name: zh
dtype: string
splits:
- name: train
num_bytes: 65469523
num_examples: 231266
- name: validation
num_bytes: 292199
num_examples: 879
- name: test
num_bytes: 2360603
num_examples: 8549
download_size: 44559127
dataset_size: 68122325
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Romecr/testImages | ---
license: other
---
|
andersonbcdefg/SPECTER-subset-dedup | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
splits:
- name: train
num_bytes: 556150901.5824461
num_examples: 128807
download_size: 319036990
dataset_size: 556150901.5824461
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jxie/epsilon | ---
dataset_info:
features:
- name: inputs
sequence:
sequence: float64
- name: label
dtype: int64
splits:
- name: train
num_bytes: 9604800000
num_examples: 400000
- name: test
num_bytes: 2401200000
num_examples: 100000
download_size: 7805263919
dataset_size: 12006000000
---
# Dataset Card for "epsilon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hippocrates/re_train | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 19219537
num_examples: 3572
- name: valid
num_bytes: 1626844
num_examples: 305
download_size: 1753501
dataset_size: 20846381
---
# Dataset Card for "re_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marcos292/LPMark | ---
license: openrail
---
|
bnsapa/cybersecurity-ner | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-Indicator
'1': B-Malware
'2': B-Organization
'3': B-System
'4': B-Vulnerability
'5': I-Indicator
'6': I-Malware
'7': I-Organization
'8': I-System
'9': I-Vulnerability
'10': O
splits:
- name: train
num_bytes: 1197515
num_examples: 2664
- name: test
num_bytes: 336600
num_examples: 717
- name: validation
num_bytes: 339858
num_examples: 785
download_size: 385026
dataset_size: 1873973
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
AndrewTsai0406/RGB_zh | ---
dataset_info:
features:
- name: id
dtype: int64
- name: query
dtype: string
- name: answer
dtype: string
- name: positive
dtype: string
- name: negative
dtype: string
splits:
- name: train
num_bytes: 5871582
num_examples: 300
download_size: 3226142
dataset_size: 5871582
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Minata/512_block_tokenized_src_fm_fc_ms_ff_method2testcases_v0 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2980662680
num_examples: 447010
- name: test
num_bytes: 282063068
num_examples: 42301
download_size: 541623207
dataset_size: 3262725748
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Deepank/CITYLID | ---
license: mit
language:
- en
- de
tags:
- code
- aerial point-cloud
- point-cloud classification
- urban streetscapes
- cross-sections
pretty_name: CITYLID
---
# CITYLID: A large-scale categorized aerial Lidar dataset for street-level research
<!-- Provide a quick summary of the dataset. -->
This repository is dedicated to providing categorized aerial Lidar datasets along with the methodology for data preparation.
Details regarding data preparation and usage are given in the [GitHub Repository](https://github.com/deepankverma/navigating_streetscapes)
### Dataset Description
The dataset covers the entire state of Berlin and is divided into 1060 tiles of 1 sq. km each. The tiles are further grouped under
[9 regions](https://fbinter.stadt-berlin.de/fb/atom/DOP/Blattschnitt2x2km.gif). The dataset comprises (a) [Categorized Point clouds](Lidar_point_clouds)
and (b) [Raster image files providing solar radiation maps](solar_radiation_rasters). The details regarding the
data preparation can be found in [GitHub Repository](https://github.com/deepankverma/navigating_streetscapes).
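A minimal inspection sketch, assuming the categorized tiles are distributed as LAS/LAZ point-cloud files readable with `laspy` (the tile path below is a placeholder, not an actual file name from the release):
```python
import numpy as np
import laspy  # pip install "laspy[lazrs]"  (assumption: tiles are LAS/LAZ files)

# Placeholder path: substitute any downloaded tile from the Lidar_point_clouds folder.
tile = laspy.read("Lidar_point_clouds/example_tile.laz")

xyz = np.vstack([tile.x, tile.y, tile.z]).T
print(f"{xyz.shape[0]} points in this 1 sq. km tile")

# Count points per classification code (the categories assigned during data preparation).
codes, counts = np.unique(tile.classification, return_counts=True)
for code, count in zip(codes, counts):
    print(f"class {code}: {count} points")
```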
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
[Verma, D., Mumm, O., & Carlow, V. M. (2023). Generating citywide street cross-sections using aerial LiDAR and detailed street plan. Sustainable Cities and Society, 96, 104673](https://www.sciencedirect.com/science/article/pii/S2210670723002846) |
FINNUMBER/FINCH_TRAIN_NQA_ARI_100 | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 282939
num_examples: 100
download_size: 176220
dataset_size: 282939
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
linqus/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: comments
sequence: string
- name: created_at
dtype: timestamp[ns, tz=UTC]
- name: updated_at
dtype: timestamp[ns, tz=UTC]
- name: closed_at
dtype: timestamp[ns, tz=UTC]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: float64
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: float64
- name: state_reason
dtype: string
- name: draft
dtype: float64
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 1717058
num_examples: 100
download_size: 564909
dataset_size: 1717058
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-8ddaed-1457553860 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: ARTeLab/it5-summarization-fanpage
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: train
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: ARTeLab/it5-summarization-fanpage
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ehahaha](https://huggingface.co/ehahaha) for evaluating this model. |
Elfsong/Bias_NLI | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 136716914
num_examples: 1912390
download_size: 48712225
dataset_size: 136716914
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yzhuang/autotree_pmlb_100000_banana_sgosdt_l256_dim10_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 1545200000
num_examples: 100000
- name: validation
num_bytes: 154520000
num_examples: 10000
download_size: 281108655
dataset_size: 1699720000
---
# Dataset Card for "autotree_pmlb_100000_banana_sgosdt_l256_dim10_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vkaradeniz/moneypay_sss_final_english | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 23898
num_examples: 74
download_size: 18721
dataset_size: 23898
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ricahrd/MckevinV4 | ---
license: openrail
---
|
yezhengli9/opus_books_demo | ---
dataset_info:
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- fr
splits:
- name: train
num_bytes: 32997043
num_examples: 127085
download_size: 20985324
dataset_size: 32997043
---
# Dataset Card for "opus_books_demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
r0ll/zelensky | ---
license: openrail
language:
- us
- ua
- ru
---
Vladimir Zelenskiy RVC v2 voice model, trained for 1500 epochs.
For good sound, set the pitch to - |
andersonbcdefg/amazon_qa_pairs_processed | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
splits:
- name: train
num_bytes: 837921623
num_examples: 2507114
download_size: 416692810
dataset_size: 837921623
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zliu333/truck_at_port | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 72377986.0
num_examples: 50
download_size: 72368630
dataset_size: 72377986.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FreedomIntelligence/huatuo_knowledge_graph_qa | ---
license: apache-2.0
task_categories:
- text-generation
language:
- zh
tags:
- medical
size_categories:
- 100K<n<1M
---
# Dataset Card for Huatuo_knowledge_graph_qa
## Dataset Description
- **Homepage: https://www.huatuogpt.cn/**
- **Repository: https://github.com/FreedomIntelligence/HuatuoGPT**
- **Paper: https://arxiv.org/abs/2305.01526**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
We built this QA dataset from a medical knowledge graph. It contains 798,444 examples in total; the questions were constructed from templates, and the answers are the contents of the corresponding knowledge-graph entries.
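The construction is described in the paper; the snippet below is only an illustrative sketch of template-based question generation from knowledge-graph entries (the templates and the example entry are invented for illustration and are not taken from the actual knowledge graph):
```python
# Illustrative sketch only: hypothetical templates and a hypothetical knowledge-graph entry.
templates = {
    "symptom": "{disease}的症状有哪些？",    # "What are the symptoms of {disease}?"
    "treatment": "{disease}应该如何治疗？",  # "How should {disease} be treated?"
}

kg_entry = {
    "disease": "糖尿病",                     # diabetes (made-up example entry)
    "symptom": "多饮、多尿、体重下降",
    "treatment": "控制饮食、规律运动，必要时使用降糖药物",
}

# Each (entity, relation, value) triple yields one templated question with the entry
# contents as the answer.
qa_pairs = [
    {"question": templates[relation].format(disease=kg_entry["disease"]),
     "answer": kg_entry[relation]}
    for relation in ("symptom", "treatment")
]
print(qa_pairs)
```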
## Dataset Creation
### Source Data
https://cpubmed.openi.org.cn/graph/wiki
https://github.com/zhihao-chen/QASystemOnMedicalGraph
https://github.com/baiyang2464/chatbot-base-on-Knowledge-Graph
## Citation
```
@misc{li2023huatuo26m,
title={Huatuo-26M, a Large-scale Chinese Medical QA Dataset},
author={Jianquan Li and Xidong Wang and Xiangbo Wu and Zhiyi Zhang and Xiaolong Xu and Jie Fu and Prayag Tiwari and Xiang Wan and Benyou Wang},
year={2023},
eprint={2305.01526},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
Sujithanumala/TheFinalPropagandaDataset | ---
dataset_info:
features:
- name: content
dtype: string
- name: labels
sequence: string
splits:
- name: train
num_bytes: 7806192
num_examples: 1062
download_size: 1241070
dataset_size: 7806192
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CoCoRooXin/eu_topic_inclusion | ---
dataset_info:
features:
- name: topic
dtype: string
- name: syntagm
dtype: string
- name: labels
dtype: int64
- name: root
dtype: string
splits:
- name: train
num_bytes: 2330816
num_examples: 34295
- name: test
num_bytes: 496594
num_examples: 7350
- name: eval
num_bytes: 499052
num_examples: 7349
download_size: 1104598
dataset_size: 3326462
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: eval
path: data/eval-*
---
|
open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama-v2.0 | ---
pretty_name: Evaluation run of abdulrahman-nuzha/finetuned-llama-v2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abdulrahman-nuzha/finetuned-llama-v2.0](https://huggingface.co/abdulrahman-nuzha/finetuned-llama-v2.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama-v2.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-10T10:47:40.022995](https://huggingface.co/datasets/open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama-v2.0/blob/main/results_2023-12-10T10-47-40.022995.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.43945994951550066,\n\
\ \"acc_stderr\": 0.034385529407471936,\n \"acc_norm\": 0.4442918982351828,\n\
\ \"acc_norm_stderr\": 0.035190222707291795,\n \"mc1\": 0.24969400244798043,\n\
\ \"mc1_stderr\": 0.015152286907148128,\n \"mc2\": 0.3908033560283727,\n\
\ \"mc2_stderr\": 0.013656125379191442\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4803754266211604,\n \"acc_stderr\": 0.014600132075947087,\n\
\ \"acc_norm\": 0.5315699658703071,\n \"acc_norm_stderr\": 0.014582236460866978\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5789683330013942,\n\
\ \"acc_stderr\": 0.0049271558825981845,\n \"acc_norm\": 0.7775343557060347,\n\
\ \"acc_norm_stderr\": 0.0041505226302310265\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n\
\ \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.44528301886792454,\n \"acc_stderr\": 0.030588052974270655,\n\
\ \"acc_norm\": 0.44528301886792454,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.37572254335260113,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.02201908001221789,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.02201908001221789\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4290322580645161,\n\
\ \"acc_stderr\": 0.02815603653823321,\n \"acc_norm\": 0.4290322580645161,\n\
\ \"acc_norm_stderr\": 0.02815603653823321\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.03872592983524754,\n\
\ \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.03872592983524754\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4696969696969697,\n \"acc_stderr\": 0.03555804051763929,\n \"\
acc_norm\": 0.4696969696969697,\n \"acc_norm_stderr\": 0.03555804051763929\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6321243523316062,\n \"acc_stderr\": 0.034801756684660366,\n\
\ \"acc_norm\": 0.6321243523316062,\n \"acc_norm_stderr\": 0.034801756684660366\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.024838811988033158,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.024838811988033158\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n\
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3907563025210084,\n \"acc_stderr\": 0.031693802357129965,\n\
\ \"acc_norm\": 0.3907563025210084,\n \"acc_norm_stderr\": 0.031693802357129965\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5853211009174312,\n \"acc_stderr\": 0.021122903208602585,\n \"\
acc_norm\": 0.5853211009174312,\n \"acc_norm_stderr\": 0.021122903208602585\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18055555555555555,\n \"acc_stderr\": 0.02623287897149166,\n \"\
acc_norm\": 0.18055555555555555,\n \"acc_norm_stderr\": 0.02623287897149166\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4803921568627451,\n \"acc_stderr\": 0.03506612560524867,\n \"\
acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.03506612560524867\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5443037974683544,\n \"acc_stderr\": 0.03241920684693334,\n \
\ \"acc_norm\": 0.5443037974683544,\n \"acc_norm_stderr\": 0.03241920684693334\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5291479820627802,\n\
\ \"acc_stderr\": 0.03350073248773404,\n \"acc_norm\": 0.5291479820627802,\n\
\ \"acc_norm_stderr\": 0.03350073248773404\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.46564885496183206,\n \"acc_stderr\": 0.04374928560599738,\n\
\ \"acc_norm\": 0.46564885496183206,\n \"acc_norm_stderr\": 0.04374928560599738\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04826217294139894,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04826217294139894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4601226993865031,\n \"acc_stderr\": 0.03915857291436972,\n\
\ \"acc_norm\": 0.4601226993865031,\n \"acc_norm_stderr\": 0.03915857291436972\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.47572815533980584,\n \"acc_stderr\": 0.049449010929737795,\n\
\ \"acc_norm\": 0.47572815533980584,\n \"acc_norm_stderr\": 0.049449010929737795\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6837606837606838,\n\
\ \"acc_stderr\": 0.030463656747340275,\n \"acc_norm\": 0.6837606837606838,\n\
\ \"acc_norm_stderr\": 0.030463656747340275\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6002554278416348,\n\
\ \"acc_stderr\": 0.017516847907053282,\n \"acc_norm\": 0.6002554278416348,\n\
\ \"acc_norm_stderr\": 0.017516847907053282\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.48554913294797686,\n \"acc_stderr\": 0.02690784985628254,\n\
\ \"acc_norm\": 0.48554913294797686,\n \"acc_norm_stderr\": 0.02690784985628254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.028629916715693413,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.028629916715693413\n \
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5562700964630225,\n\
\ \"acc_stderr\": 0.028217683556652308,\n \"acc_norm\": 0.5562700964630225,\n\
\ \"acc_norm_stderr\": 0.028217683556652308\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5030864197530864,\n \"acc_stderr\": 0.027820214158594363,\n\
\ \"acc_norm\": 0.5030864197530864,\n \"acc_norm_stderr\": 0.027820214158594363\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.32269503546099293,\n \"acc_stderr\": 0.02788913930053478,\n \
\ \"acc_norm\": 0.32269503546099293,\n \"acc_norm_stderr\": 0.02788913930053478\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32920469361147325,\n\
\ \"acc_stderr\": 0.012002091666902297,\n \"acc_norm\": 0.32920469361147325,\n\
\ \"acc_norm_stderr\": 0.012002091666902297\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329387,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329387\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4297385620915033,\n \"acc_stderr\": 0.020027122784928554,\n \
\ \"acc_norm\": 0.4297385620915033,\n \"acc_norm_stderr\": 0.020027122784928554\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.45454545454545453,\n\
\ \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.45454545454545453,\n\
\ \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.363265306122449,\n \"acc_stderr\": 0.03078905113903081,\n\
\ \"acc_norm\": 0.363265306122449,\n \"acc_norm_stderr\": 0.03078905113903081\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5920398009950248,\n\
\ \"acc_stderr\": 0.03475116365194092,\n \"acc_norm\": 0.5920398009950248,\n\
\ \"acc_norm_stderr\": 0.03475116365194092\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6374269005847953,\n \"acc_stderr\": 0.0368713061556206,\n\
\ \"acc_norm\": 0.6374269005847953,\n \"acc_norm_stderr\": 0.0368713061556206\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24969400244798043,\n\
\ \"mc1_stderr\": 0.015152286907148128,\n \"mc2\": 0.3908033560283727,\n\
\ \"mc2_stderr\": 0.013656125379191442\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.744277821625888,\n \"acc_stderr\": 0.012261253845440474\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09931766489764973,\n \
\ \"acc_stderr\": 0.008238371412683965\n }\n}\n```"
repo_url: https://huggingface.co/abdulrahman-nuzha/finetuned-llama-v2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|arc:challenge|25_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|gsm8k|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hellaswag|10_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T10-47-40.022995.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T10-47-40.022995.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- '**/details_harness|winogrande|5_2023-12-10T10-47-40.022995.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-10T10-47-40.022995.parquet'
- config_name: results
data_files:
- split: 2023_12_10T10_47_40.022995
path:
- results_2023-12-10T10-47-40.022995.parquet
- split: latest
path:
- results_2023-12-10T10-47-40.022995.parquet
---
# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-llama-v2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/abdulrahman-nuzha/finetuned-llama-v2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [abdulrahman-nuzha/finetuned-llama-v2.0](https://huggingface.co/abdulrahman-nuzha/finetuned-llama-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama-v2.0",
"harness_winogrande_5",
split="train")
```
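Similarly, to inspect only the aggregated scores, you can load the "results" configuration with its "latest" split (a minimal sketch; the config and split names are taken from the listing above):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama-v2.0",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated scores per task
```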
## Latest results
These are the [latest results from run 2023-12-10T10:47:40.022995](https://huggingface.co/datasets/open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama-v2.0/blob/main/results_2023-12-10T10-47-40.022995.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.43945994951550066,
"acc_stderr": 0.034385529407471936,
"acc_norm": 0.4442918982351828,
"acc_norm_stderr": 0.035190222707291795,
"mc1": 0.24969400244798043,
"mc1_stderr": 0.015152286907148128,
"mc2": 0.3908033560283727,
"mc2_stderr": 0.013656125379191442
},
"harness|arc:challenge|25": {
"acc": 0.4803754266211604,
"acc_stderr": 0.014600132075947087,
"acc_norm": 0.5315699658703071,
"acc_norm_stderr": 0.014582236460866978
},
"harness|hellaswag|10": {
"acc": 0.5789683330013942,
"acc_stderr": 0.0049271558825981845,
"acc_norm": 0.7775343557060347,
"acc_norm_stderr": 0.0041505226302310265
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.03999309712777471,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.03999309712777471
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44528301886792454,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.44528301886792454,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04155319955593146,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04155319955593146
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.02201908001221789,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.02201908001221789
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4290322580645161,
"acc_stderr": 0.02815603653823321,
"acc_norm": 0.4290322580645161,
"acc_norm_stderr": 0.02815603653823321
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.03872592983524754,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.03872592983524754
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4696969696969697,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.4696969696969697,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6321243523316062,
"acc_stderr": 0.034801756684660366,
"acc_norm": 0.6321243523316062,
"acc_norm_stderr": 0.034801756684660366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4,
"acc_stderr": 0.024838811988033158,
"acc_norm": 0.4,
"acc_norm_stderr": 0.024838811988033158
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3907563025210084,
"acc_stderr": 0.031693802357129965,
"acc_norm": 0.3907563025210084,
"acc_norm_stderr": 0.031693802357129965
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5853211009174312,
"acc_stderr": 0.021122903208602585,
"acc_norm": 0.5853211009174312,
"acc_norm_stderr": 0.021122903208602585
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18055555555555555,
"acc_stderr": 0.02623287897149166,
"acc_norm": 0.18055555555555555,
"acc_norm_stderr": 0.02623287897149166
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.03506612560524867,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.03506612560524867
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5443037974683544,
"acc_stderr": 0.03241920684693334,
"acc_norm": 0.5443037974683544,
"acc_norm_stderr": 0.03241920684693334
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5291479820627802,
"acc_stderr": 0.03350073248773404,
"acc_norm": 0.5291479820627802,
"acc_norm_stderr": 0.03350073248773404
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.46564885496183206,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.46564885496183206,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04826217294139894,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04826217294139894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4601226993865031,
"acc_stderr": 0.03915857291436972,
"acc_norm": 0.4601226993865031,
"acc_norm_stderr": 0.03915857291436972
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.47572815533980584,
"acc_stderr": 0.049449010929737795,
"acc_norm": 0.47572815533980584,
"acc_norm_stderr": 0.049449010929737795
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6837606837606838,
"acc_stderr": 0.030463656747340275,
"acc_norm": 0.6837606837606838,
"acc_norm_stderr": 0.030463656747340275
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6002554278416348,
"acc_stderr": 0.017516847907053282,
"acc_norm": 0.6002554278416348,
"acc_norm_stderr": 0.017516847907053282
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.02690784985628254,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.02690784985628254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5,
"acc_stderr": 0.028629916715693413,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028629916715693413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5562700964630225,
"acc_stderr": 0.028217683556652308,
"acc_norm": 0.5562700964630225,
"acc_norm_stderr": 0.028217683556652308
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5030864197530864,
"acc_stderr": 0.027820214158594363,
"acc_norm": 0.5030864197530864,
"acc_norm_stderr": 0.027820214158594363
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32269503546099293,
"acc_stderr": 0.02788913930053478,
"acc_norm": 0.32269503546099293,
"acc_norm_stderr": 0.02788913930053478
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32920469361147325,
"acc_stderr": 0.012002091666902297,
"acc_norm": 0.32920469361147325,
"acc_norm_stderr": 0.012002091666902297
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329387,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329387
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4297385620915033,
"acc_stderr": 0.020027122784928554,
"acc_norm": 0.4297385620915033,
"acc_norm_stderr": 0.020027122784928554
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.04769300568972744,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.04769300568972744
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.363265306122449,
"acc_stderr": 0.03078905113903081,
"acc_norm": 0.363265306122449,
"acc_norm_stderr": 0.03078905113903081
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5920398009950248,
"acc_stderr": 0.03475116365194092,
"acc_norm": 0.5920398009950248,
"acc_norm_stderr": 0.03475116365194092
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6374269005847953,
"acc_stderr": 0.0368713061556206,
"acc_norm": 0.6374269005847953,
"acc_norm_stderr": 0.0368713061556206
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24969400244798043,
"mc1_stderr": 0.015152286907148128,
"mc2": 0.3908033560283727,
"mc2_stderr": 0.013656125379191442
},
"harness|winogrande|5": {
"acc": 0.744277821625888,
"acc_stderr": 0.012261253845440474
},
"harness|gsm8k|5": {
"acc": 0.09931766489764973,
"acc_stderr": 0.008238371412683965
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
OmarAmir2001/my-image-dataset | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 11730860.0
num_examples: 119
download_size: 11636743
dataset_size: 11730860.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AlekseyKorshuk/cup-it-ds-classification-small-2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 4545195
num_examples: 7930
- name: validation
num_bytes: 1259443
num_examples: 2203
download_size: 3520634
dataset_size: 5804638
---
# Dataset Card for "cup-it-ds-classification-small-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Leon-LLM/Leon-Chess-Dataset-1M | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 551374495
num_examples: 1028170
download_size: 282346024
dataset_size: 551374495
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Leon-Chess-Dataset-1M"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nille1991/Leitliniendatenbank | ---
license: bigscience-openrail-m
language:
- de
size_categories:
- n<1K
---
This database stores all AWMF guidelines created by the Deutsche Gesellschaft für Orthopädie und Unfallchirurgie (DGOU, German Society for Orthopaedics and Trauma Surgery). |
reichenbach/arxiv_ppr_embeds | ---
annotations_creators:
- found
language:
- en
language_creators:
- found
license:
- unknown
multilinguality:
- monolingual
pretty_name: ScientificPapers
size_categories:
- 100K<n<1M
source_datasets:
- scientific_papers
task_categories:
- summarization
task_ids: []
paperswithcode_id: null
tags:
- abstractive-summarization
dataset_info:
features:
- name: article
dtype: string
- name: abstract
dtype: string
- name: embeddings
sequence: float64
splits:
- name: train
num_bytes: 8367611540
num_examples: 203037
- name: validation
num_bytes: 256178362
num_examples: 6440
- name: test
num_bytes: 255771184
num_examples: 6436
download_size: 4718720913
dataset_size: 8879561086
---
# Dataset Card for "scientific_papers"
This dataset is derived from https://huggingface.co/datasets/scientific_papers, with embeddings additionally created via https://huggingface.co/docs/transformers/model_doc/rag using the Natural Questions-trained base model.
It was created for Retrieval-Augmented Generation (RAG) examples and experiments.
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** https://github.com/armancohan/long-summarization
- **Paper:** [A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents](https://arxiv.org/abs/1804.05685)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
The scientific papers dataset contains a set of long, structured documents obtained from the arXiv repository.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### arxiv
- **Size of downloaded dataset files:** 4.50 GB
- **Size of the generated dataset:** 7.58 GB
- **Total amount of disk used:** 12.09 GB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"abstract": "\" we have studied the leptonic decay @xmath0 , via the decay channel @xmath1 , using a sample of tagged @xmath2 decays collected...",
"article": "\"the leptonic decays of a charged pseudoscalar meson @xmath7 are processes of the type @xmath8 , where @xmath9 , @xmath10 , or @...",
"section_names": "[sec:introduction]introduction\n[sec:detector]data and the cleo- detector\n[sec:analysys]analysis method\n[sec:conclusion]summary"
}
```
### Data Fields
The data fields are the same among all splits.
#### arxiv
- `article`: a `string` feature.
- `abstract`: a `string` feature.
- `section_names`: a `string` feature.
- `embeddings`: a 768-dimensional vector of `float` values
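As a rough illustration of how the precomputed embeddings can be used for retrieval, the sketch below builds a FAISS index over the `embeddings` column and queries it with a DPR question encoder. The encoder checkpoint (`facebook/dpr-question_encoder-single-nq-base`), the split used, and the example query are assumptions for illustration, not part of this dataset's documentation:
```python
import numpy as np
from datasets import load_dataset
from transformers import DPRQuestionEncoder, DPRQuestionEncoderTokenizer

# Load a split of the dataset (each row has `article`, `abstract`, `embeddings`)
ds = load_dataset("reichenbach/arxiv_ppr_embeds", split="validation")

# Build a FAISS index over the precomputed 768-d embeddings
# (requires `faiss-cpu` or `faiss-gpu` to be installed)
ds.add_faiss_index(column="embeddings")

# Encode a query with a DPR question encoder (assumed to be compatible with the
# Natural Questions-trained RAG retriever used to create the embeddings)
tokenizer = DPRQuestionEncoderTokenizer.from_pretrained(
    "facebook/dpr-question_encoder-single-nq-base"
)
encoder = DPRQuestionEncoder.from_pretrained(
    "facebook/dpr-question_encoder-single-nq-base"
)
question = "leptonic decays of charged pseudoscalar mesons"
q_emb = encoder(**tokenizer(question, return_tensors="pt")).pooler_output
q_emb = q_emb.detach().numpy().astype(np.float32)[0]

# Retrieve the 3 nearest abstracts
scores, examples = ds.get_nearest_examples("embeddings", q_emb, k=3)
for score, abstract in zip(scores, examples["abstract"]):
    print(round(float(score), 2), abstract[:80])
```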
### Data Splits
| name |train |validation|test|
|------|-----:|---------:|---:|
|arxiv |203037| 6436|6440|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{Cohan_2018,
title={A Discourse-Aware Attention Model for Abstractive Summarization of
Long Documents},
url={http://dx.doi.org/10.18653/v1/n18-2097},
DOI={10.18653/v1/n18-2097},
journal={Proceedings of the 2018 Conference of the North American Chapter of
the Association for Computational Linguistics: Human Language
Technologies, Volume 2 (Short Papers)},
publisher={Association for Computational Linguistics},
author={Cohan, Arman and Dernoncourt, Franck and Kim, Doo Soon and Bui, Trung and Kim, Seokhwan and Chang, Walter and Goharian, Nazli},
year={2018}
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@jplu](https://github.com/jplu), [@lewtun](https://github.com/lewtun), [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset. |
open-llm-leaderboard/details_JunchengXie__Mistral-7B-Instruct-v0.1-gpt-4-80k-base_lora | ---
pretty_name: Evaluation run of JunchengXie/Mistral-7B-Instruct-v0.1-gpt-4-80k-base_lora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JunchengXie/Mistral-7B-Instruct-v0.1-gpt-4-80k-base_lora](https://huggingface.co/JunchengXie/Mistral-7B-Instruct-v0.1-gpt-4-80k-base_lora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JunchengXie__Mistral-7B-Instruct-v0.1-gpt-4-80k-base_lora\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T23:56:07.967023](https://huggingface.co/datasets/open-llm-leaderboard/details_JunchengXie__Mistral-7B-Instruct-v0.1-gpt-4-80k-base_lora/blob/main/results_2024-03-27T23-56-07.967023.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5452277178528238,\n\
\ \"acc_stderr\": 0.03410493442579154,\n \"acc_norm\": 0.5519072949752738,\n\
\ \"acc_norm_stderr\": 0.03485960483749285,\n \"mc1\": 0.412484700122399,\n\
\ \"mc1_stderr\": 0.01723329939957122,\n \"mc2\": 0.5680735045864234,\n\
\ \"mc2_stderr\": 0.01559042793493668\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5162116040955631,\n \"acc_stderr\": 0.014603708567414945,\n\
\ \"acc_norm\": 0.5366894197952219,\n \"acc_norm_stderr\": 0.01457200052775699\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5514837681736706,\n\
\ \"acc_stderr\": 0.004963259311700565,\n \"acc_norm\": 0.7358095996813384,\n\
\ \"acc_norm_stderr\": 0.004400000822742066\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874141,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874141\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.569811320754717,\n \"acc_stderr\": 0.030471445867183235,\n\
\ \"acc_norm\": 0.569811320754717,\n \"acc_norm_stderr\": 0.030471445867183235\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.040329990539607195,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.040329990539607195\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.03807301726504513,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.03807301726504513\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.03266204299064678,\n\
\ \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.03266204299064678\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332786,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332786\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n\
\ \"acc_stderr\": 0.027218889773308753,\n \"acc_norm\": 0.6451612903225806,\n\
\ \"acc_norm_stderr\": 0.027218889773308753\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.034304624161038716,\n\
\ \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.034304624161038716\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.0364620496325381,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.0364620496325381\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.03182155050916645,\n\
\ \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.03182155050916645\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.02534267129380725,\n \
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.02534267129380725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871916,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871916\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7137614678899082,\n \"acc_stderr\": 0.01937943662892,\n \"acc_norm\"\
: 0.7137614678899082,\n \"acc_norm_stderr\": 0.01937943662892\n },\n \
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.03362277436608044,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.03362277436608044\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475544,\n\
\ \"acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475544\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.04587904741301809,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.04587904741301809\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.024662496845209818,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.024662496845209818\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7445721583652618,\n\
\ \"acc_stderr\": 0.015594955384455773,\n \"acc_norm\": 0.7445721583652618,\n\
\ \"acc_norm_stderr\": 0.015594955384455773\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.02658923114217426,\n\
\ \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.02658923114217426\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2212290502793296,\n\
\ \"acc_stderr\": 0.01388216459888727,\n \"acc_norm\": 0.2212290502793296,\n\
\ \"acc_norm_stderr\": 0.01388216459888727\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824096,\n\
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824096\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n\
\ \"acc_stderr\": 0.02777091853142784,\n \"acc_norm\": 0.6045016077170418,\n\
\ \"acc_norm_stderr\": 0.02777091853142784\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.02733954664066273,\n\
\ \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.02733954664066273\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573086,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573086\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40547588005215124,\n\
\ \"acc_stderr\": 0.012539960672377204,\n \"acc_norm\": 0.40547588005215124,\n\
\ \"acc_norm_stderr\": 0.012539960672377204\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.0302114796091216,\n\
\ \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.0302114796091216\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5326797385620915,\n \"acc_stderr\": 0.020184583359102202,\n \
\ \"acc_norm\": 0.5326797385620915,\n \"acc_norm_stderr\": 0.020184583359102202\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425464,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425464\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555404,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555404\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.412484700122399,\n\
\ \"mc1_stderr\": 0.01723329939957122,\n \"mc2\": 0.5680735045864234,\n\
\ \"mc2_stderr\": 0.01559042793493668\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7237569060773481,\n \"acc_stderr\": 0.01256681501569816\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.177407126611069,\n \
\ \"acc_stderr\": 0.010522533016890778\n }\n}\n```"
repo_url: https://huggingface.co/JunchengXie/Mistral-7B-Instruct-v0.1-gpt-4-80k-base_lora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|arc:challenge|25_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|gsm8k|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hellaswag|10_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T23-56-07.967023.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T23-56-07.967023.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- '**/details_harness|winogrande|5_2024-03-27T23-56-07.967023.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T23-56-07.967023.parquet'
- config_name: results
data_files:
- split: 2024_03_27T23_56_07.967023
path:
- results_2024-03-27T23-56-07.967023.parquet
- split: latest
path:
- results_2024-03-27T23-56-07.967023.parquet
---
# Dataset Card for Evaluation run of JunchengXie/Mistral-7B-Instruct-v0.1-gpt-4-80k-base_lora
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JunchengXie/Mistral-7B-Instruct-v0.1-gpt-4-80k-base_lora](https://huggingface.co/JunchengXie/Mistral-7B-Instruct-v0.1-gpt-4-80k-base_lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JunchengXie__Mistral-7B-Instruct-v0.1-gpt-4-80k-base_lora",
"harness_winogrande_5",
    split="latest")
```
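As a complement to the snippet above, the following is a minimal sketch (assuming the repository and config names listed in the YAML header of this card) that loads the aggregated "results" configuration at its "latest" split and converts it to a pandas DataFrame for inspection; adjust the config or split name if your copy of the repository differs.
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_JunchengXie__Mistral-7B-Instruct-v0.1-gpt-4-80k-base_lora"

# Load the aggregated "results" configuration; the "latest" split always
# points to the most recent evaluation run (see the configs above).
results = load_dataset(REPO, "results", split="latest")

# Convert to a pandas DataFrame to browse the aggregated metrics.
df = results.to_pandas()
print(df.columns)
print(df.head())
```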
## Latest results
These are the [latest results from run 2024-03-27T23:56:07.967023](https://huggingface.co/datasets/open-llm-leaderboard/details_JunchengXie__Mistral-7B-Instruct-v0.1-gpt-4-80k-base_lora/blob/main/results_2024-03-27T23-56-07.967023.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the task-specific configuration and its "latest" split):
```json
{
"all": {
"acc": 0.5452277178528238,
"acc_stderr": 0.03410493442579154,
"acc_norm": 0.5519072949752738,
"acc_norm_stderr": 0.03485960483749285,
"mc1": 0.412484700122399,
"mc1_stderr": 0.01723329939957122,
"mc2": 0.5680735045864234,
"mc2_stderr": 0.01559042793493668
},
"harness|arc:challenge|25": {
"acc": 0.5162116040955631,
"acc_stderr": 0.014603708567414945,
"acc_norm": 0.5366894197952219,
"acc_norm_stderr": 0.01457200052775699
},
"harness|hellaswag|10": {
"acc": 0.5514837681736706,
"acc_stderr": 0.004963259311700565,
"acc_norm": 0.7358095996813384,
"acc_norm_stderr": 0.004400000822742066
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874141,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874141
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.569811320754717,
"acc_stderr": 0.030471445867183235,
"acc_norm": 0.569811320754717,
"acc_norm_stderr": 0.030471445867183235
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.040329990539607195,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.040329990539607195
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.03807301726504513,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.03807301726504513
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332786,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332786
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.027218889773308753,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.027218889773308753
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3891625615763547,
"acc_stderr": 0.034304624161038716,
"acc_norm": 0.3891625615763547,
"acc_norm_stderr": 0.034304624161038716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.0364620496325381,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.0364620496325381
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7357512953367875,
"acc_stderr": 0.03182155050916645,
"acc_norm": 0.7357512953367875,
"acc_norm_stderr": 0.03182155050916645
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.02534267129380725,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.02534267129380725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871916,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.03243718055137411,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.03243718055137411
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7137614678899082,
"acc_stderr": 0.01937943662892,
"acc_norm": 0.7137614678899082,
"acc_norm_stderr": 0.01937943662892
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.031822318676475544,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.031822318676475544
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.04587904741301809,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.04587904741301809
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.024662496845209818,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.024662496845209818
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7445721583652618,
"acc_stderr": 0.015594955384455773,
"acc_norm": 0.7445721583652618,
"acc_norm_stderr": 0.015594955384455773
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.02658923114217426,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.02658923114217426
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2212290502793296,
"acc_stderr": 0.01388216459888727,
"acc_norm": 0.2212290502793296,
"acc_norm_stderr": 0.01388216459888727
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.028213504177824096,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.028213504177824096
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.02777091853142784,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.02777091853142784
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.02733954664066273,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.02733954664066273
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573086,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573086
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40547588005215124,
"acc_stderr": 0.012539960672377204,
"acc_norm": 0.40547588005215124,
"acc_norm_stderr": 0.012539960672377204
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.0302114796091216,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.0302114796091216
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5326797385620915,
"acc_stderr": 0.020184583359102202,
"acc_norm": 0.5326797385620915,
"acc_norm_stderr": 0.020184583359102202
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425464,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425464
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555404,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555404
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.412484700122399,
"mc1_stderr": 0.01723329939957122,
"mc2": 0.5680735045864234,
"mc2_stderr": 0.01559042793493668
},
"harness|winogrande|5": {
"acc": 0.7237569060773481,
"acc_stderr": 0.01256681501569816
},
"harness|gsm8k|5": {
"acc": 0.177407126611069,
"acc_stderr": 0.010522533016890778
}
}
```
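For a quick sanity check of the numbers above, the sketch below (hypothetical helper code, not part of the evaluation harness) downloads the raw results JSON linked in this section with `huggingface_hub.hf_hub_download` and computes an unweighted mean of `acc_norm` over the `hendrycksTest` (MMLU) subtasks; the top-level layout of the file is assumed to either match the excerpt above or nest it under a `"results"` key.
```python
import json

from huggingface_hub import hf_hub_download

REPO = "open-llm-leaderboard/details_JunchengXie__Mistral-7B-Instruct-v0.1-gpt-4-80k-base_lora"
FILENAME = "results_2024-03-27T23-56-07.967023.json"  # linked in the "Latest results" section

# Download the raw results file from the dataset repository.
path = hf_hub_download(repo_id=REPO, filename=FILENAME, repo_type="dataset")
with open(path) as f:
    data = json.load(f)

# The per-task metrics may sit at the top level (as in the excerpt above)
# or under a "results" key, depending on the harness version.
results = data.get("results", data)

# Unweighted mean of acc_norm over the MMLU (hendrycksTest) subtasks.
mmlu = [v["acc_norm"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU subtasks: {len(mmlu)}, unweighted mean acc_norm: {sum(mmlu) / len(mmlu):.4f}")
```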
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]