| datasetId | card |
|---|---|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_46 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1174021704.0
num_examples: 230562
download_size: 1192362190
dataset_size: 1174021704.0
---
# Dataset Card for "chunk_46"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/semeval-task-8-b-v2-test-paraphrase-2 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: model
dtype: string
- name: source
dtype: string
- name: label
dtype: int64
- name: id
dtype: int64
- name: paraphrase
dtype: string
- name: paraphrase2
dtype: string
splits:
- name: test
num_bytes: 11109023
num_examples: 3000
download_size: 5184022
dataset_size: 11109023
---
# Dataset Card for "semeval-task-8-b-v2-test-paraphrase-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_60 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 24513850800.875
num_examples: 255225
download_size: 21653620015
dataset_size: 24513850800.875
---
# Dataset Card for "chunk_60"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
visionlab/block-towers-10k-3s-trajectory-scale1 | ---
dataset_info:
- config_name: stack3_stable
features:
- name: data
struct:
- name: final_positions
list:
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: params
struct:
- name: duration
dtype: float64
- name: framerate
dtype: int64
- name: scale_factor
dtype: float64
- name: timestep
dtype: float64
- name: start_positions
list:
- name: density
dtype: 'null'
- name: lx
dtype: float64
- name: ly
dtype: float64
- name: lz
dtype: float64
- name: mass
dtype: 'null'
- name: rx
dtype: float64
- name: ry
dtype: float64
- name: rz
dtype: float64
- name: unstable
dtype: float64
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: trajectory
list:
- name: data
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: xmat
sequence: float64
- name: xyz
sequence: float64
- name: physics_step
dtype: int64
- name: t
dtype: float64
- name: video_frame
dtype: int64
- name: video_t
dtype: float64
- name: label
dtype: int64
- name: num_blocks
dtype: int64
splits:
- name: train
num_bytes: 6144000
num_examples: 8000
- name: test
num_bytes: 1536000
num_examples: 2000
download_size: 772415
dataset_size: 7680000
- config_name: stack3_unstable
features:
- name: data
struct:
- name: final_positions
list:
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: params
struct:
- name: duration
dtype: float64
- name: framerate
dtype: int64
- name: scale_factor
dtype: float64
- name: timestep
dtype: float64
- name: start_positions
list:
- name: density
dtype: 'null'
- name: lx
dtype: float64
- name: ly
dtype: float64
- name: lz
dtype: float64
- name: mass
dtype: 'null'
- name: rx
dtype: float64
- name: ry
dtype: float64
- name: rz
dtype: float64
- name: unstable
dtype: float64
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: trajectory
list:
- name: data
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: xmat
sequence: float64
- name: xyz
sequence: float64
- name: physics_step
dtype: int64
- name: t
dtype: float64
- name: video_frame
dtype: int64
- name: video_t
dtype: float64
- name: label
dtype: int64
- name: num_blocks
dtype: int64
splits:
- name: train
num_bytes: 573216000
num_examples: 8000
- name: test
num_bytes: 143304000
num_examples: 2000
download_size: 357842807
dataset_size: 716520000
- config_name: stack4_stable
features:
- name: data
struct:
- name: final_positions
list:
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: params
struct:
- name: duration
dtype: float64
- name: framerate
dtype: int64
- name: scale_factor
dtype: float64
- name: timestep
dtype: float64
- name: start_positions
list:
- name: density
dtype: 'null'
- name: lx
dtype: float64
- name: ly
dtype: float64
- name: lz
dtype: float64
- name: mass
dtype: 'null'
- name: rx
dtype: float64
- name: ry
dtype: float64
- name: rz
dtype: float64
- name: unstable
dtype: float64
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: trajectory
list:
- name: data
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: xmat
sequence: float64
- name: xyz
sequence: float64
- name: physics_step
dtype: int64
- name: t
dtype: float64
- name: video_frame
dtype: int64
- name: video_t
dtype: float64
- name: label
dtype: int64
- name: num_blocks
dtype: int64
splits:
- name: train
num_bytes: 7936000
num_examples: 8000
- name: test
num_bytes: 1984000
num_examples: 2000
download_size: 1082273
dataset_size: 9920000
- config_name: stack4_unstable
features:
- name: data
struct:
- name: final_positions
list:
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: params
struct:
- name: duration
dtype: float64
- name: framerate
dtype: int64
- name: scale_factor
dtype: float64
- name: timestep
dtype: float64
- name: start_positions
list:
- name: density
dtype: 'null'
- name: lx
dtype: float64
- name: ly
dtype: float64
- name: lz
dtype: float64
- name: mass
dtype: 'null'
- name: rx
dtype: float64
- name: ry
dtype: float64
- name: rz
dtype: float64
- name: unstable
dtype: float64
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: trajectory
list:
- name: data
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: xmat
sequence: float64
- name: xyz
sequence: float64
- name: physics_step
dtype: int64
- name: t
dtype: float64
- name: video_frame
dtype: int64
- name: video_t
dtype: float64
- name: label
dtype: int64
- name: num_blocks
dtype: int64
splits:
- name: train
num_bytes: 746848000
num_examples: 8000
- name: test
num_bytes: 186712000
num_examples: 2000
download_size: 535206285
dataset_size: 933560000
- config_name: stack5_stable
features:
- name: data
struct:
- name: final_positions
list:
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: params
struct:
- name: duration
dtype: float64
- name: framerate
dtype: int64
- name: scale_factor
dtype: float64
- name: timestep
dtype: float64
- name: start_positions
list:
- name: density
dtype: 'null'
- name: lx
dtype: float64
- name: ly
dtype: float64
- name: lz
dtype: float64
- name: mass
dtype: 'null'
- name: rx
dtype: float64
- name: ry
dtype: float64
- name: rz
dtype: float64
- name: unstable
dtype: float64
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: trajectory
list:
- name: data
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: xmat
sequence: float64
- name: xyz
sequence: float64
- name: physics_step
dtype: int64
- name: t
dtype: float64
- name: video_frame
dtype: int64
- name: video_t
dtype: float64
- name: label
dtype: int64
- name: num_blocks
dtype: int64
splits:
- name: train
num_bytes: 9728000
num_examples: 8000
- name: test
num_bytes: 2432000
num_examples: 2000
download_size: 1395431
dataset_size: 12160000
- config_name: stack5_unstable
features:
- name: data
struct:
- name: final_positions
list:
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: params
struct:
- name: duration
dtype: float64
- name: framerate
dtype: int64
- name: scale_factor
dtype: float64
- name: timestep
dtype: float64
- name: start_positions
list:
- name: density
dtype: 'null'
- name: lx
dtype: float64
- name: ly
dtype: float64
- name: lz
dtype: float64
- name: mass
dtype: 'null'
- name: rx
dtype: float64
- name: ry
dtype: float64
- name: rz
dtype: float64
- name: unstable
dtype: float64
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: trajectory
list:
- name: data
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: xmat
sequence: float64
- name: xyz
sequence: float64
- name: physics_step
dtype: int64
- name: t
dtype: float64
- name: video_frame
dtype: int64
- name: video_t
dtype: float64
- name: label
dtype: int64
- name: num_blocks
dtype: int64
splits:
- name: train
num_bytes: 920480000
num_examples: 8000
- name: test
num_bytes: 230120000
num_examples: 2000
download_size: 704078782
dataset_size: 1150600000
- config_name: stack6_stable
features:
- name: data
struct:
- name: final_positions
list:
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: params
struct:
- name: duration
dtype: float64
- name: framerate
dtype: int64
- name: scale_factor
dtype: float64
- name: timestep
dtype: float64
- name: start_positions
list:
- name: density
dtype: 'null'
- name: lx
dtype: float64
- name: ly
dtype: float64
- name: lz
dtype: float64
- name: mass
dtype: 'null'
- name: rx
dtype: float64
- name: ry
dtype: float64
- name: rz
dtype: float64
- name: unstable
dtype: float64
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: trajectory
list:
- name: data
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: xmat
sequence: float64
- name: xyz
sequence: float64
- name: physics_step
dtype: int64
- name: t
dtype: float64
- name: video_frame
dtype: int64
- name: video_t
dtype: float64
- name: label
dtype: int64
- name: num_blocks
dtype: int64
splits:
- name: train
num_bytes: 11520000
num_examples: 8000
- name: test
num_bytes: 2880000
num_examples: 2000
download_size: 1746742
dataset_size: 14400000
- config_name: stack6_unstable
features:
- name: data
struct:
- name: final_positions
list:
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: params
struct:
- name: duration
dtype: float64
- name: framerate
dtype: int64
- name: scale_factor
dtype: float64
- name: timestep
dtype: float64
- name: start_positions
list:
- name: density
dtype: 'null'
- name: lx
dtype: float64
- name: ly
dtype: float64
- name: lz
dtype: float64
- name: mass
dtype: 'null'
- name: rx
dtype: float64
- name: ry
dtype: float64
- name: rz
dtype: float64
- name: unstable
dtype: float64
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: trajectory
list:
- name: data
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: xmat
sequence: float64
- name: xyz
sequence: float64
- name: physics_step
dtype: int64
- name: t
dtype: float64
- name: video_frame
dtype: int64
- name: video_t
dtype: float64
- name: label
dtype: int64
- name: num_blocks
dtype: int64
splits:
- name: train
num_bytes: 1094112000
num_examples: 8000
- name: test
num_bytes: 273528000
num_examples: 2000
download_size: 877902271
dataset_size: 1367640000
configs:
- config_name: default
data_files:
- split: train
path: stack*/train-*
- split: test
path: stack*/test-*
- config_name: stack3
data_files:
- split: train
path: stack3*/train-*
- split: test
path: stack3*/test-*
- config_name: stack4
data_files:
- split: train
path: stack4*/train-*
- split: test
path: stack4*/test-*
- config_name: stack5
data_files:
- split: train
path: stack5*/train-*
- split: test
path: stack5*/test-*
- config_name: stack6
data_files:
- split: train
path: stack6*/train-*
- split: test
path: stack6*/test-*
- config_name: stack3_stable
data_files:
- split: train
path: stack3_stable/train-*
- split: test
path: stack3_stable/test-*
- config_name: stack3_unstable
data_files:
- split: train
path: stack3_unstable/train-*
- split: test
path: stack3_unstable/test-*
- config_name: stack4_stable
data_files:
- split: train
path: stack4_stable/train-*
- split: test
path: stack4_stable/test-*
- config_name: stack4_unstable
data_files:
- split: train
path: stack4_unstable/train-*
- split: test
path: stack4_unstable/test-*
- config_name: stack5_stable
data_files:
- split: train
path: stack5_stable/train-*
- split: test
path: stack5_stable/test-*
- config_name: stack5_unstable
data_files:
- split: train
path: stack5_unstable/train-*
- split: test
path: stack5_unstable/test-*
- config_name: stack6_stable
data_files:
- split: train
path: stack6_stable/train-*
- split: test
path: stack6_stable/test-*
- config_name: stack6_unstable
data_files:
- split: train
path: stack6_unstable/train-*
- split: test
path: stack6_unstable/test-*
---
|
Hiraishin/ujianjpj-test-a | ---
license: apache-2.0
---
|
DDSC/europarl | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- da
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: EuroParl
size_categories:
- n<1K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- sentiment-classification
---
# Dataset Card for EuroParl
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Direct Download**: http://danlp-downloads.alexandra.dk/datasets/europarl.sentiment2.zip
### Dataset Summary
This dataset consists of Danish data from the European Parliament that has been annotated for sentiment analysis by the [Alexandra Institute](https://github.com/alexandrainst); all credit goes to them.
### Supported Tasks and Leaderboards
This dataset is suitable for sentiment analysis.
### Languages
This dataset is in Danish.
## Dataset Structure
### Data Instances
Every entry in the dataset has a document and an associated label.
### Data Fields
An entry in the dataset consists of the following fields:
- `text` (`str`): The text content.
- `label` (`str`): The label of the `text`. Can be "positiv", "neutral" or "negativ" for positive, neutral and negative sentiment, respectively.
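To make the schema concrete, here is a minimal sketch (plain Python; the sample rows are hypothetical, not taken from the dataset) of what entries look like and how the Danish string labels might be mapped to integer ids:

```python
# Hypothetical sample rows mirroring the card's schema: a `text` string
# and a `label` string in {"positiv", "neutral", "negativ"}.
LABEL2ID = {"negativ": 0, "neutral": 1, "positiv": 2}

sample = [
    {"text": "Det er en glimrende beslutning.", "label": "positiv"},
    {"text": "Forslaget blev behandlet i udvalget.", "label": "neutral"},
]

# Map the string labels to integer ids, e.g. for model training.
ids = [LABEL2ID[row["label"]] for row in sample]
```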
### Data Splits
`train` and `test` splits are available, with the test split comprising 30% of the dataset, randomly sampled in a stratified fashion. There are 669 documents in the training split and 288 in the test split.
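A stratified split like the one described can be sketched as follows (plain Python with synthetic data; the actual split was produced by the curators, so this is illustrative only):

```python
import random
from collections import defaultdict

def stratified_split(rows, test_frac=0.3, seed=42):
    """Split rows into train/test, preserving the per-label proportions."""
    by_label = defaultdict(list)
    for row in rows:
        by_label[row["label"]].append(row)
    rng = random.Random(seed)
    train, test = [], []
    for label_rows in by_label.values():
        rng.shuffle(label_rows)
        n_test = round(len(label_rows) * test_frac)
        test.extend(label_rows[:n_test])
        train.extend(label_rows[n_test:])
    return train, test

# Synthetic example: 10 documents per label, split 70/30 within each label.
rows = [{"text": f"doc {i}", "label": lab}
        for i, lab in enumerate(["positiv"] * 10 + ["negativ"] * 10)]
train, test = stratified_split(rows)
```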
## Additional Information
### Dataset Curators
The dataset was collected and annotated solely by the [Alexandra Institute](https://github.com/alexandrainst).
### Licensing Information
The dataset is released under the CC BY 4.0 license.
### Citation Information
```
@misc{europarl,
title={EuroParl},
author={Alexandra Institute},
year={2020},
note={\url{https://danlp-alexandra.readthedocs.io/en/latest/docs/datasets.html#europarl-sentiment2}}
}
```
### Contributions
Thanks to [@saattrupdan](https://github.com/saattrupdan) for adding this dataset to the Hugging Face Hub. |
shidowake/FreedomIntelligence_alpaca-gpt4-japanese_subset_split_3 | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 4863217.322740098
num_examples: 4997
download_size: 2566678
dataset_size: 4863217.322740098
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-e1907042-7494829 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- clinc_oos
eval_info:
task: multi_class_classification
model: optimum/roberta-large-finetuned-clinc
metrics: []
dataset_name: clinc_oos
dataset_config: small
dataset_split: test
col_mapping:
text: text
target: intent
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: optimum/roberta-large-finetuned-clinc
* Dataset: clinc_oos
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
open-llm-leaderboard/details_NeuralNovel__Senzu-7B-v0.1-DPO | ---
pretty_name: Evaluation run of NeuralNovel/Senzu-7B-v0.1-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NeuralNovel/Senzu-7B-v0.1-DPO](https://huggingface.co/NeuralNovel/Senzu-7B-v0.1-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeuralNovel__Senzu-7B-v0.1-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T01:01:42.766922](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Senzu-7B-v0.1-DPO/blob/main/results_2024-03-01T01-01-42.766922.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6202862238005727,\n\
\ \"acc_stderr\": 0.032864673209060856,\n \"acc_norm\": 0.6257751511285696,\n\
\ \"acc_norm_stderr\": 0.033545183337135576,\n \"mc1\": 0.3084455324357405,\n\
\ \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.4528829753775133,\n\
\ \"mc2_stderr\": 0.01582449122934889\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6373720136518771,\n \"acc_stderr\": 0.014049106564955009,\n\
\ \"acc_norm\": 0.6672354948805461,\n \"acc_norm_stderr\": 0.013769863046192309\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6606253734315873,\n\
\ \"acc_stderr\": 0.00472529390522825,\n \"acc_norm\": 0.8433578968333001,\n\
\ \"acc_norm_stderr\": 0.0036272018740533913\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.03999309712777474,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.03999309712777474\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.03899073687357334,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.03899073687357334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n\
\ \"acc_stderr\": 0.025560604721022895,\n \"acc_norm\": 0.7193548387096774,\n\
\ \"acc_norm_stderr\": 0.025560604721022895\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124498,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124498\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478926,\n\
\ \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478926\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465715,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465715\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8073394495412844,\n \"acc_stderr\": 0.01690927688493608,\n \"\
acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.01690927688493608\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.02636165166838909,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.02636165166838909\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728742,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728742\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7918263090676884,\n\
\ \"acc_stderr\": 0.014518592248904033,\n \"acc_norm\": 0.7918263090676884,\n\
\ \"acc_norm_stderr\": 0.014518592248904033\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577612,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577612\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n\
\ \"acc_stderr\": 0.014736926383761983,\n \"acc_norm\": 0.2636871508379888,\n\
\ \"acc_norm_stderr\": 0.014736926383761983\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958147,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958147\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.02577311116963045,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.02577311116963045\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n\
\ \"acc_stderr\": 0.012695244711379772,\n \"acc_norm\": 0.44589308996088656,\n\
\ \"acc_norm_stderr\": 0.012695244711379772\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6160130718954249,\n \"acc_stderr\": 0.019675808135281508,\n \
\ \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.019675808135281508\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3084455324357405,\n\
\ \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.4528829753775133,\n\
\ \"mc2_stderr\": 0.01582449122934889\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7995264404104183,\n \"acc_stderr\": 0.011251958281205073\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3297952994692949,\n \
\ \"acc_stderr\": 0.012949955030571154\n }\n}\n```"
repo_url: https://huggingface.co/NeuralNovel/Senzu-7B-v0.1-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|arc:challenge|25_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|gsm8k|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hellaswag|10_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-01-42.766922.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T01-01-42.766922.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- '**/details_harness|winogrande|5_2024-03-01T01-01-42.766922.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T01-01-42.766922.parquet'
- config_name: results
data_files:
- split: 2024_03_01T01_01_42.766922
path:
- results_2024-03-01T01-01-42.766922.parquet
- split: latest
path:
- results_2024-03-01T01-01-42.766922.parquet
---
# Dataset Card for Evaluation run of NeuralNovel/Senzu-7B-v0.1-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NeuralNovel/Senzu-7B-v0.1-DPO](https://huggingface.co/NeuralNovel/Senzu-7B-v0.1-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NeuralNovel__Senzu-7B-v0.1-DPO",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-01T01:01:42.766922](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Senzu-7B-v0.1-DPO/blob/main/results_2024-03-01T01-01-42.766922.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the timestamped and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.6202862238005727,
"acc_stderr": 0.032864673209060856,
"acc_norm": 0.6257751511285696,
"acc_norm_stderr": 0.033545183337135576,
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.4528829753775133,
"mc2_stderr": 0.01582449122934889
},
"harness|arc:challenge|25": {
"acc": 0.6373720136518771,
"acc_stderr": 0.014049106564955009,
"acc_norm": 0.6672354948805461,
"acc_norm_stderr": 0.013769863046192309
},
"harness|hellaswag|10": {
"acc": 0.6606253734315873,
"acc_stderr": 0.00472529390522825,
"acc_norm": 0.8433578968333001,
"acc_norm_stderr": 0.0036272018740533913
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.03999309712777474,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.03999309712777474
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03899073687357334,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03899073687357334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.025560604721022895,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.025560604721022895
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124498,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124498
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478926,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478926
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465715,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465715
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.01690927688493608,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.01690927688493608
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.02636165166838909,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.02636165166838909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7918263090676884,
"acc_stderr": 0.014518592248904033,
"acc_norm": 0.7918263090676884,
"acc_norm_stderr": 0.014518592248904033
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577612,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761983,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761983
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.02577311116963045,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.02577311116963045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379772,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379772
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.019675808135281508,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.019675808135281508
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.4528829753775133,
"mc2_stderr": 0.01582449122934889
},
"harness|winogrande|5": {
"acc": 0.7995264404104183,
"acc_stderr": 0.011251958281205073
},
"harness|gsm8k|5": {
"acc": 0.3297952994692949,
"acc_stderr": 0.012949955030571154
}
}
```
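The per-task metrics in a results file like the one above can be aggregated programmatically. A minimal sketch (the tasks below are a small illustrative excerpt, and the exact averaging the leaderboard applies may differ):

```python
import statistics

# A small excerpt shaped like the results JSON above.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6672354948805461},
    "harness|hellaswag|10": {"acc_norm": 0.8433578968333001},
    "harness|winogrande|5": {"acc": 0.7995264404104183},
}

def mean_metric(results, metric="acc_norm", fallback="acc"):
    """Average a metric across tasks, falling back when a task lacks it."""
    values = [task.get(metric, task.get(fallback)) for task in results.values()]
    return statistics.mean(values)

print(round(mean_metric(results), 4))
```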
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
TalTechNLP/instructionSum | ---
license: cc-by-4.0
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 3680328493
num_examples: 510624
download_size: 2177598966
dataset_size: 3680328493
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
presencesw/dataset1_translated_END | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: references
sequence: string
- name: question_vi
dtype: string
- name: answer_vi
dtype: string
- name: references_vi
sequence: string
splits:
- name: train
num_bytes: 82049546
num_examples: 13500
download_size: 42287221
dataset_size: 82049546
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nexdata/Japanese_Pronunciation_Dictionary | ---
task_categories:
- automatic-speech-recognition
language:
- ja
---
# Dataset Card for Nexdata/Japanese_Pronunciation_Dictionary
## Description
The data contains 101,702 entries. All words and pronunciations are produced by Japanese linguists. It can be used in the research and development of Japanese ASR technology.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1088?source=Huggingface
# Specifications
## Format
TXT
## Data content
101,702 Japanese words and corresponding hiragana characters
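Such a word-to-reading lexicon is straightforward to load into a Python dict. A hedged sketch — the actual delimiter and layout of the Nexdata TXT files are not documented here, so one entry per line with a tab between word and hiragana reading is an assumption:

```python
def load_pronunciation_dict(path):
    """Parse a word<TAB>reading lexicon into a dict.

    Assumes one entry per line, tab-separated; the real Nexdata
    file format may differ.
    """
    lexicon = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            word, reading = line.split("\t", 1)
            lexicon[word] = reading
    return lexicon
```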
## Language
Japanese
## Application scenario
speech recognition
# Licensing Information
Commercial License |
atmallen/qm_bob_hard_4_mixture_1.0e | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
splits:
- name: train
num_bytes: 4578170.5
num_examples: 37091
- name: validation
num_bytes: 487083.5
num_examples: 3969
- name: test
num_bytes: 477119.5
num_examples: 3926
download_size: 1539574
dataset_size: 5542373.5
---
# Dataset Card for "qm_bob_hard_4_mixture_1.0e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kkkkpojjhh/liniker | ---
license: apache-2.0
---
|
JeisonJA/CSV_TRAIN_FORMAT | ---
license: apache-2.0
---
|
DavidVivancos/MindBigData2023_MNIST-8B | ---
license: odbl
---
## Dataset Summary
MindBigData 2023 MNIST-8B is, to date (June 1st, 2023), the largest open brain-signals dataset created for Machine Learning. It is based on EEG signals from a single subject, captured using a custom 128-channel device, and replicates the full 70,000 digits of Yann LeCun et al.'s MNIST dataset. The brain signals were captured while the subject was watching the pixels of the original digits one by one on a screen and listening at the same time to the spoken number, 0 to 9, from the real label.
Supporting dataset for paper https://arxiv.org/abs/2306.00455
The dataset contains 140,000 records from 128 EEG channels, each of 2 seconds, recorded at 250hz, in total 17,920,000 brain signals and 8,960,000,000 data points.
It consists of 2 main CSV data files:
- "train.csv": 45 GB, header + 120,000 rows × 64,791 columns
- "test.csv": 7.52 GB, header + 20,000 rows × 64,791 columns

10 audio files in a folder named "audiolabels": "0.wav", "1.wav", ... "9.wav"

And 1 CSV file with 3D coordinates of the EEG electrodes: "3Dcoords.csv", 4.27 KB, header + 130 rows × 4 columns
>update July 18th 2023: As requested, a reduced 2-billion-datapoint version has been released: https://huggingface.co/datasets/DavidVivancos/MindBigData2023_MNIST-2B
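At 45 GB, "train.csv" is too large to load whole on most machines; streaming it in chunks is one option. A minimal sketch using pandas (the column layout is described in the supporting paper and not assumed here; the chunk size is an arbitrary choice):

```python
import pandas as pd

def iter_chunks(path="train.csv", chunksize=1000):
    """Yield the large CSV in row chunks instead of loading it whole."""
    for chunk in pd.read_csv(path, chunksize=chunksize):
        yield chunk

# Example: count rows without holding the file in memory.
# total_rows = sum(len(chunk) for chunk in iter_chunks())
```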
## Dataset Structure
review supporting paper https://arxiv.org/abs/2306.00455
## Data Fields
review supporting paper https://arxiv.org/abs/2306.00455
## Citation
```
@article{MindBigData_2023_MNIST-8B,
title={MindBigData 2023 MNIST-8B The 8 billion datapoints Multimodal Dataset of Brain Signals},
author={David Vivancos},
journal={arXiv preprint arXiv:2306.00455},
year={2023}
}
``` |
liuyanchen1015/MULTI_VALUE_stsb_quotative_like | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 984
num_examples: 4
- name: test
num_bytes: 624
num_examples: 2
- name: train
num_bytes: 3093
num_examples: 15
download_size: 13581
dataset_size: 4701
---
# Dataset Card for "MULTI_VALUE_stsb_quotative_like"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zouharvi/optimal-reference-translations | ---
license: cc
configs:
- config_name: ort_human
data_files: ort_human.json
- config_name: ort_wmt
data_files: ort_wmt.json
default: true
task_categories:
- translation
language:
- cs
- en
tags:
- quality
- human_translation
- evaluation
pretty_name: Optimal Reference Translations
size_categories:
- 1K<n<10K
---
This is the dataset for two papers: **Quality and Quantity of Machine Translation References for Automated Metrics [[paper](https://arxiv.org/abs/2401.01283)]** - effect of reference quality and quantity on automatic metric performance, and **Evaluating Optimal Reference Translations [[paper]](https://arxiv.org/abs/2311.16787)** - creation of the data and human aspects of annotation and translation.
Please see the [original repository](https://github.com/ufal/optimal-reference-translations) for more information and the raw data or [contact the authors](mailto:vilem.zouhar@gmail.com) with any questions.
Please make sure that you have the latest `datasets` installed:
```python
from datasets import load_dataset

data_human = load_dataset("zouharvi/optimal-reference-translations", 'ort_human')["train"]
data_wmt = load_dataset("zouharvi/optimal-reference-translations", 'ort_wmt')["train"]
```
# Quality and Quantity of Machine Translation References for Automated Metrics [[paper](https://arxiv.org/abs/2401.01283)]
> **Abstract:** Automatic machine translation metrics often use _human_ translations to determine the quality _system_ translations. Common wisdom in the field dictates that the human references should be of very high quality. However, there are no cost-benefit analyses that could be used to guide practitioners who plan to collect references for machine translation evaluation. We find that higher-quality references lead to better metric correlations with humans at the segment-level. Having up to 7 references per segment and taking their average helps. Interestingly, the references from vendors of different qualities can be mixed together and improve metric success. Higher quality references, however, cost more to create and we frame this as an optimization problem: given a specific budget, what types of references should be collected to maximize metric success. These findings can be used by evaluators of shared tasks when references need to be created under a certain budget.
Cite [this paper](https://arxiv.org/abs/2401.01283) as:
```
@misc{zouhar2024quality,
title={Quality and Quantity of Machine Translation References for Automated Metrics},
author={Vilém Zouhar and Ondřej Bojar},
year={2024},
eprint={2401.01283},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
# Evaluating Optimal Reference Translations [[paper]](https://arxiv.org/abs/2311.16787)
> **Abstract:** The overall translation quality reached by current machine translation (MT) systems for high-resourced language pairs is remarkably good. Standard methods of evaluation are not suitable nor intended to uncover the many translation errors and quality deficiencies that still persist. Furthermore, the quality of standard reference translations is commonly questioned and comparable quality levels have been reached by MT alone in several language pairs. Navigating further research in these high-resource settings is thus difficult. In this article, we propose a methodology for creating more reliable document-level human reference translations, called "optimal reference translations," with the simple aim to raise the bar of what should be deemed "human translation quality." We evaluate the obtained document-level optimal reference translations in comparison with "standard" ones, confirming a significant quality increase and also documenting the relationship between evaluation and translation editing.
This is project at ETH Zürich and ÚFAL Charles University. [Paper](https://arxiv.org/abs/2311.16787) to be published in Natural Language Engineering 2024.
For now cite as:
```
@misc{zouhar2023evaluating,
title={Evaluating Optimal Reference Translations},
author={Vilém Zouhar and Věra Kloudová and Martin Popel and Ondřej Bojar},
year={2023},
eprint={2311.16787},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
Collected English to Czech translation evaluation human data are in [`data/ort_human.json`](data/ort_human.json). The rest of this repository contains data preparation and evaluation code.
Our data is based on WMT2020 data and can thus be also used to e.g. evaluate the quality of various translations as references.
The process of the data is as follows:
1. R1, R2, and R3 are independent translations from English to Czech. R4 is an expert translation by a translatologist.
2. All the human translations are evaluated on document and segment level with detail (in [`data/ort_human.json`](data/ort_human.json)) by different types of human annotators (laypeople, translatology students, professional translators). If the translation is not perfect, the annotators provide a post-edited version for which they would assign the highest grade (6).
Note: If you also want to use the WMT2020 system submissions, please contact [Vilém Zouhar](vilem.zouhar@gmail.com). The code is here, just not pretty yet. 🙂
## Example usage
```python3
from datasets import load_dataset
data = load_dataset("zouharvi/optimal-reference-translations", 'ort_human')["train"]
# 220 annotated documents
len(data)
# 1760 annotated source lines
sum([len(doc["lines"]) for doc in data])
# 7040 annotated translations
sum([sum([len(line["translations"]) for line in doc["lines"]]) for doc in data])
# 11 annotators
len(set(doc["uid"] for doc in data))
import numpy as np
# Average document-level for R4: 5.865
np.average([doc["rating"]["4"]["overall"] for doc in data])
# Average document-level for R3: 4.810
np.average([doc["rating"]["3"]["overall"] for doc in data])
```
## Data structure
Beginning of `ort_wmt` (human evaluation of multiple WMT systems):
```
[
{
"src": "The government has compulsorily retired 15 more tax officers in the fourth tranche of its crackdown on errant officials accused of corruption and other malpractices.",
"systems": {
"Online-Z.1630": {
"tgt": "Vláda povinně vysloužila 15 dalších daňových úředníků ve čtvrtém tranši svého zásahu proti potulným úředníkům obviněným z korupce a dalších zanedbaných praktik.",
"score": 0.11005665393442401
},
"SRPOL.522": {
"tgt": "Vláda povinně odvolala dalších 15 daňových úředníků ve čtvrté části svého zásahu proti chybujícím úředníkům obviněným z korupce a dalších nekalých praktik.",
"score": -0.13950326931169
},
"CUNI-DocTransformer.1450": {
"tgt": "Vláda poslala do penze dalších 15 daňových úředníků ve čtvrté části svého zátahu proti zatoulaným úředníkům obviněným z korupce a dalších nekalých praktik.",
"score": 0.768625609971165
},
"Online-G.1555": {
"tgt": "Vláda nuceně poslala do důchodu dalších 15 daňových úředníků ve čtvrté části svého zákroku proti zbloudilým úředníkům obviněným z korupce a jiných pochybení.",
"score": 0.13307070461983803
},
"UEDIN-CUNI.1482": {
"tgt": "Vláda povinně odvolala dalších 15 daňových úředníků ve čtvrté tranši zátahu proti potulným úředníkům obviněným z korupce a dalších nekalých praktik.",
"score": 0
},
"Online-B.1589": {
"tgt": "Vláda povinně odešla do důchodu dalších 15 daňových úředníků ve čtvrté tranši svého zákroku proti errancujícím úředníkům obviněným z korupce a jiných nezákonných praktik.",
"score": -0.961245691263453
},
"CUNI-Transformer.1080": {
"tgt": "Vláda nuceně propustila dalších 15 daňových úředníků ve čtvrté tranši svého zátahu proti chybujícím úředníkům obviněným z korupce a dalších nekalých praktik.",
"score": 0.33815971885602303
},
"Online-A.1573": {
"tgt": "Vláda povinně odešel 15 více daňových úředníků ve čtvrté tranši svého zákroku proti potulný úředníků obviněných z korupce a dalších nekalých praktik.",
"score": -0.46360315621142206
},
"ref": {
"tgt": "Vláda přikázala odchod do důchodu dalším 15 daňovým úředníkům v rámci čtvrtého balíčku opatření proti úředníkům obviněným z korupce a dalších nezákonných praktik.",
"score": 0.282610797572385
},
"CUNI-T2T-2018.1071": {
"tgt": "Vláda ve čtvrté tranši svého zátahu proti pochybným úředníkům obviněným z korupce a dalších nekalých praktik povinně odvolala dalších 15 daňových úředníků.",
"score": -0.28717921591702694
},
"eTranslation.1048": {
"tgt": "Vláda ve čtvrté tranši svého zákroku proti chybujícím úředníkům obviněným z korupce a dalších nekalých praktik nuceně odvolala dalších 15 daňových úředníků.",
"score": 0.444703520638052
},
"OPPO.1121": {
"tgt": "Vláda ve čtvrté tranši svého zásahu proti zbloudilým úředníkům obviněným z korupce a dalších nekalých praktik nuceně propustila dalších 15 daňových úředníků.",
"score": 1.9093879205480302
},
"zlabs-nlp.1151": {
"tgt": "Vláda povinně odešel do důchodu 15 dalších daňových důstojníků ve čtvrtém tranši jeho praskání na vymazané úředníky obviněné z korupce a dalších malpraktices.",
"score": -1.20533986295174
}
},
"ref": {
"R1": "Vláda přikázala odchod do důchodu dalším 15 daňovým úředníkům v rámci čtvrtého balíčku opatření proti úředníkům obviněným z korupce a dalších nezákonných praktik.",
"R1_pe_student_sahara": "Indická vláda přikázala odchod do důchodu dalším patnácti daňovým úředníkům v rámci čtvrtého balíčku opatření proti úředníkům obviněným z korupce a dalších nezákonných praktik.",
"R2": "Vláda povinně poslala do důchodu dalších 15 daňových úředníků ve čtvrté vlně svého zásahu proti špatným úředníkům obviněným z korupce a dalších profesních pochybení.",
"R2_pe_student_sahara": "Indická vláda ve čtvrté vlně svého zásahu proti úředníkům obviněným z korupce a dalších profesních pochybení poslala dalších patnáct daňových úředníků do povinného důchodu.",
"R3": "Vláda odvolala 15 dalších daňových úředníků ve čtvrté tranši stíhání vykyvujících úředníků obviněných z korupce a dalších nezákonných praktik.",
...
```
Beginning of `ort_human` (human evaluation of multiple human translations):
```
[
{
"uid": "sahara",
"expertise": "student",
"doc": "huffingtonpost.com.19385",
"time": 210.0, # self-reported in minutes
"rating": {
"2": { # 2 = P2
"spelling": 4.0, # ranges from 0 to 6
"terminology": 5.5,
"grammar": 5.5,
"meaning": 5.0,
"style": 4.5,
"pragmatics": 6.0,
"overall": 4.5
},
"4": { # 4 = N1
"spelling": 6.0,
"terminology": 6.0,
"grammar": 6.0,
"meaning": 5.0,
"style": 5.0,
"pragmatics": 6.0,
"overall": 5.7
},
"1": { # 1 = P1
"spelling": 6.0,
"terminology": 5.9,
"grammar": 5.4,
"meaning": 4.7,
"style": 4.6,
"pragmatics": 5.8,
"overall": 5.0
},
"3": { # 3 = P3
"spelling": 4.5,
"terminology": 4.7,
"grammar": 5.0,
"meaning": 4.5,
"style": 5.0,
"pragmatics": 6.0,
"overall": 4.6
}
},
"lines": [
{
"source": "Sony, Disney Back To Work On Third Spider-Man Film", # source sentence
"comment": null,
"translations": {
"2": {
"orig": "Sony a Disney opět pracují na třetím filmu o Spider-Manovi", # original translation
"done": "Sony a Disney pracují na třetím filmu o Spider-Manovi", # post-edited translation
"rating": {
"spelling": 6.0,
"terminology": 6.0,
"grammar": 6.0,
"meaning": 5.0,
"style": 6.0,
"pragmatics": 6.0,
"overall": 5.0
}
},
"4": {
"orig": "Sony a Disney opět spolupracují na třetím filmu o Spider-Manovi",
"done": "Sony a Disney opět spolupracují na třetím filmu o Spider-Manovi",
"rating": {
"spelling": 6.0,
"terminology": 6.0,
"grammar": 6.0,
"meaning": 6.0,
"style": 6.0,
"pragmatics": 6.0,
"overall": 6.0
}
},
...
``` |
JayalekshmiGopakumar/DocLayexp1 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': financial_reports
'1': government_tenders
'2': manuals
'3': laws_and_regulations
'4': scientific_articles
'5': patents
- name: ground_truth
dtype: string
splits:
- name: test
num_bytes: 3240643.0
num_examples: 12
- name: train
num_bytes: 16492390.0
num_examples: 43
- name: validation
num_bytes: 1929905.3125
num_examples: 5
download_size: 21721061
dataset_size: 21662938.3125
---
# Dataset Card for "DocLayexp1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_160m_bo16_2_mix_50_kl_0.1_prm_160m_thr_1.0_seed_3 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43778026
num_examples: 18928
- name: epoch_1
num_bytes: 44414700
num_examples: 18928
- name: epoch_2
num_bytes: 44463459
num_examples: 18928
- name: epoch_3
num_bytes: 44504519
num_examples: 18928
- name: epoch_4
num_bytes: 44524186
num_examples: 18928
- name: epoch_5
num_bytes: 44508265
num_examples: 18928
- name: epoch_6
num_bytes: 44494966
num_examples: 18928
- name: epoch_7
num_bytes: 44479117
num_examples: 18928
- name: epoch_8
num_bytes: 44471722
num_examples: 18928
- name: epoch_9
num_bytes: 44465380
num_examples: 18928
- name: epoch_10
num_bytes: 44460120
num_examples: 18928
- name: epoch_11
num_bytes: 44461044
num_examples: 18928
- name: epoch_12
num_bytes: 44459111
num_examples: 18928
- name: epoch_13
num_bytes: 44454936
num_examples: 18928
- name: epoch_14
num_bytes: 44455185
num_examples: 18928
- name: epoch_15
num_bytes: 44457591
num_examples: 18928
- name: epoch_16
num_bytes: 44456363
num_examples: 18928
- name: epoch_17
num_bytes: 44457557
num_examples: 18928
- name: epoch_18
num_bytes: 44460508
num_examples: 18928
- name: epoch_19
num_bytes: 44460626
num_examples: 18928
- name: epoch_20
num_bytes: 44458783
num_examples: 18928
- name: epoch_21
num_bytes: 44459668
num_examples: 18928
- name: epoch_22
num_bytes: 44459777
num_examples: 18928
- name: epoch_23
num_bytes: 44459779
num_examples: 18928
- name: epoch_24
num_bytes: 44458014
num_examples: 18928
- name: epoch_25
num_bytes: 44460051
num_examples: 18928
- name: epoch_26
num_bytes: 44459532
num_examples: 18928
- name: epoch_27
num_bytes: 44458956
num_examples: 18928
- name: epoch_28
num_bytes: 44458160
num_examples: 18928
- name: epoch_29
num_bytes: 44458407
num_examples: 18928
download_size: 701402776
dataset_size: 1333278508
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
open-llm-leaderboard/details_abhishek__autotrain-ixpiv-6kj1e | ---
pretty_name: Evaluation run of abhishek/autotrain-ixpiv-6kj1e
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abhishek/autotrain-ixpiv-6kj1e](https://huggingface.co/abhishek/autotrain-ixpiv-6kj1e)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhishek__autotrain-ixpiv-6kj1e\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T22:05:43.621226](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishek__autotrain-ixpiv-6kj1e/blob/main/results_2024-03-21T22-05-43.621226.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5805069377691062,\n\
\ \"acc_stderr\": 0.033161302516735554,\n \"acc_norm\": 0.5907084167098668,\n\
\ \"acc_norm_stderr\": 0.03406273085391777,\n \"mc1\": 0.32068543451652387,\n\
\ \"mc1_stderr\": 0.016339170373280906,\n \"mc2\": 0.45718481969610947,\n\
\ \"mc2_stderr\": 0.015494724297586289\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.014409825518403077,\n\
\ \"acc_norm\": 0.6168941979522184,\n \"acc_norm_stderr\": 0.014206472661672881\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.657837084246166,\n\
\ \"acc_stderr\": 0.004734642167493349,\n \"acc_norm\": 0.8254331806413066,\n\
\ \"acc_norm_stderr\": 0.0037882037293467024\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779205,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.040166600304512336,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.040166600304512336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n\
\ \"acc_stderr\": 0.026148685930671753,\n \"acc_norm\": 0.6967741935483871,\n\
\ \"acc_norm_stderr\": 0.026148685930671753\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391243,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391243\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164525,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164525\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5820512820512821,\n \"acc_stderr\": 0.02500732988246122,\n \
\ \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.02500732988246122\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.03128217706368461,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.03128217706368461\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7651376146788991,\n \"acc_stderr\": 0.018175110510343574,\n \"\
acc_norm\": 0.7651376146788991,\n \"acc_norm_stderr\": 0.018175110510343574\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145635,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145635\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293433,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293433\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834277,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834277\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.038968789850704164,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.038968789850704164\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935575,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935575\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543674,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543674\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7675606641123882,\n\
\ \"acc_stderr\": 0.015104550008905713,\n \"acc_norm\": 0.7675606641123882,\n\
\ \"acc_norm_stderr\": 0.015104550008905713\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.02590663263101613,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.02590663263101613\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.329608938547486,\n\
\ \"acc_stderr\": 0.015721531075183877,\n \"acc_norm\": 0.329608938547486,\n\
\ \"acc_norm_stderr\": 0.015721531075183877\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.02753007844711031,\n\
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.02753007844711031\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.027417996705630995,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.027417996705630995\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.025910063528240875,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.025910063528240875\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41590612777053454,\n\
\ \"acc_stderr\": 0.012588323850313622,\n \"acc_norm\": 0.41590612777053454,\n\
\ \"acc_norm_stderr\": 0.012588323850313622\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5882352941176471,\n \"acc_stderr\": 0.01991037746310594,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.01991037746310594\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674268,\n\
\ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674268\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32068543451652387,\n\
\ \"mc1_stderr\": 0.016339170373280906,\n \"mc2\": 0.45718481969610947,\n\
\ \"mc2_stderr\": 0.015494724297586289\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.01198854184484391\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/abhishek/autotrain-ixpiv-6kj1e
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|arc:challenge|25_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|gsm8k|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hellaswag|10_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T22-05-43.621226.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T22-05-43.621226.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- '**/details_harness|winogrande|5_2024-03-21T22-05-43.621226.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T22-05-43.621226.parquet'
- config_name: results
data_files:
- split: 2024_03_21T22_05_43.621226
path:
- results_2024-03-21T22-05-43.621226.parquet
- split: latest
path:
- results_2024-03-21T22-05-43.621226.parquet
---
# Dataset Card for Evaluation run of abhishek/autotrain-ixpiv-6kj1e
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhishek/autotrain-ixpiv-6kj1e](https://huggingface.co/abhishek/autotrain-ixpiv-6kj1e) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhishek__autotrain-ixpiv-6kj1e",
	"harness_winogrande_5",
	split="latest")
```
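The timestamped split names encode the run time with `:` and `-` replaced by `_` so they are valid split identifiers. A small helper (an assumption for illustration, not part of this repo) can recover the original datetime from such a name:

```python
from datetime import datetime

# Hedged sketch: non-"latest" splits are named after the run timestamp,
# e.g. "2024_03_21T22_05_43.621226", with ":" and "-" replaced by "_".
# This hypothetical helper converts a split name back to a datetime.
def split_to_datetime(split_name: str) -> datetime:
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.strptime(iso, "%Y-%m-%dT%H:%M:%S.%f")

print(split_to_datetime("2024_03_21T22_05_43.621226"))
# → 2024-03-21 22:05:43.621226
```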
## Latest results
These are the [latest results from run 2024-03-21T22:05:43.621226](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishek__autotrain-ixpiv-6kj1e/blob/main/results_2024-03-21T22-05-43.621226.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5805069377691062,
"acc_stderr": 0.033161302516735554,
"acc_norm": 0.5907084167098668,
"acc_norm_stderr": 0.03406273085391777,
"mc1": 0.32068543451652387,
"mc1_stderr": 0.016339170373280906,
"mc2": 0.45718481969610947,
"mc2_stderr": 0.015494724297586289
},
"harness|arc:challenge|25": {
"acc": 0.5827645051194539,
"acc_stderr": 0.014409825518403077,
"acc_norm": 0.6168941979522184,
"acc_norm_stderr": 0.014206472661672881
},
"harness|hellaswag|10": {
"acc": 0.657837084246166,
"acc_stderr": 0.004734642167493349,
"acc_norm": 0.8254331806413066,
"acc_norm_stderr": 0.0037882037293467024
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779205,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.040166600304512336,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.040166600304512336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.026148685930671753,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.026148685930671753
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391243,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391243
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164525,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164525
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5820512820512821,
"acc_stderr": 0.02500732988246122,
"acc_norm": 0.5820512820512821,
"acc_norm_stderr": 0.02500732988246122
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.03128217706368461,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.03128217706368461
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7651376146788991,
"acc_stderr": 0.018175110510343574,
"acc_norm": 0.7651376146788991,
"acc_norm_stderr": 0.018175110510343574
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145635
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293433,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293433
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834277,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834277
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935575,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935575
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543674,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543674
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7675606641123882,
"acc_stderr": 0.015104550008905713,
"acc_norm": 0.7675606641123882,
"acc_norm_stderr": 0.015104550008905713
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.02590663263101613,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.02590663263101613
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.329608938547486,
"acc_stderr": 0.015721531075183877,
"acc_norm": 0.329608938547486,
"acc_norm_stderr": 0.015721531075183877
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.02753007844711031,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.02753007844711031
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630995,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630995
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.025910063528240875,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.025910063528240875
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41590612777053454,
"acc_stderr": 0.012588323850313622,
"acc_norm": 0.41590612777053454,
"acc_norm_stderr": 0.012588323850313622
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5919117647058824,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.5919117647058824,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.01991037746310594,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.01991037746310594
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.03151236044674268,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.03151236044674268
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32068543451652387,
"mc1_stderr": 0.016339170373280906,
"mc2": 0.45718481969610947,
"mc2_stderr": 0.015494724297586289
},
"harness|winogrande|5": {
"acc": 0.760852407261247,
"acc_stderr": 0.01198854184484391
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
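The per-task `acc_norm` scores above can be aggregated into a single MMLU-style average by filtering on the `hendrycksTest` task prefix. The sketch below uses three values copied from the JSON; a real aggregation would cover all 57 subtasks:

```python
# Hedged sketch: averaging acc_norm over hendrycksTest (MMLU) subtasks.
# Values are copied from three of the entries in the results JSON above.
scores = {
    "harness|hendrycksTest-abstract_algebra|5": 0.32,
    "harness|hendrycksTest-anatomy|5": 0.6074074074074074,
    "harness|hendrycksTest-world_religions|5": 0.8245614035087719,
}

# Keep only MMLU (hendrycksTest) entries and take the unweighted mean.
mmlu = [v for k, v in scores.items() if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu) / len(mmlu)
print(round(mmlu_avg, 3))
# → 0.584
```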
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
chiyuanhsiao/HowFarAreYou_3DSpeakerTrain | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: label
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 1875822993.8165143
num_examples: 4284
- name: validation
num_bytes: 212625552.76748583
num_examples: 477
download_size: 2055764378
dataset_size: 2088448546.584
---
# Dataset Card for "HowFarAreYou_3DSpeakerTrain"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
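The split metadata in the YAML above is self-consistent: `dataset_size` equals the sum of the per-split `num_bytes` (up to floating-point round-off). A quick stdlib check, with the values copied from the card's metadata:

```python
# Per-split byte counts from the card's YAML metadata.
splits = {
    "train": 1875822993.8165143,       # 4284 examples
    "validation": 212625552.76748583,  # 477 examples
}

dataset_size = 2088448546.584  # as reported in the YAML

total = sum(splits.values())
# Floating-point round-off means we compare within a small tolerance.
print(f"sum of splits: {total:.3f}")
assert abs(total - dataset_size) < 1e-3
```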
open-llm-leaderboard/details_ZySec-AI__ZySec-7B | ---
pretty_name: Evaluation run of ZySec-AI/ZySec-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ZySec-AI/ZySec-7B](https://huggingface.co/ZySec-AI/ZySec-7B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ZySec-AI__ZySec-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T02:20:54.183750](https://huggingface.co/datasets/open-llm-leaderboard/details_ZySec-AI__ZySec-7B/blob/main/results_2024-03-22T02-20-54.183750.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5833427561467389,\n\
\ \"acc_stderr\": 0.03342813901320813,\n \"acc_norm\": 0.5898847098167896,\n\
\ \"acc_norm_stderr\": 0.034124395634022184,\n \"mc1\": 0.3561811505507956,\n\
\ \"mc1_stderr\": 0.016763790728446335,\n \"mc2\": 0.5111163939897228,\n\
\ \"mc2_stderr\": 0.015418045555863789\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5204778156996587,\n \"acc_stderr\": 0.01459913135303501,\n\
\ \"acc_norm\": 0.5750853242320819,\n \"acc_norm_stderr\": 0.014445698968520769\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5978888667596096,\n\
\ \"acc_stderr\": 0.0048932206350117925,\n \"acc_norm\": 0.7972515435172276,\n\
\ \"acc_norm_stderr\": 0.004012249939174913\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.042992689054808644,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.042992689054808644\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.040329990539607195,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.040329990539607195\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887249,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887249\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6838709677419355,\n\
\ \"acc_stderr\": 0.026450874489042778,\n \"acc_norm\": 0.6838709677419355,\n\
\ \"acc_norm_stderr\": 0.026450874489042778\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624335,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624335\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.031911782267135466,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.031911782267135466\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710862,\n\
\ \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710862\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059274,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059274\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790222,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790222\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7107843137254902,\n \"acc_stderr\": 0.03182231867647553,\n \"\
acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.03182231867647553\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035303,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035303\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.038968789850704164,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.038968789850704164\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7701149425287356,\n\
\ \"acc_stderr\": 0.01504630184669181,\n \"acc_norm\": 0.7701149425287356,\n\
\ \"acc_norm_stderr\": 0.01504630184669181\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531015,\n\
\ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531015\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3005586592178771,\n\
\ \"acc_stderr\": 0.015334566806251159,\n \"acc_norm\": 0.3005586592178771,\n\
\ \"acc_norm_stderr\": 0.015334566806251159\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.027530078447110314,\n\
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.027530078447110314\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.027210420375934016,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.027210420375934016\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.025976566010862744,\n\
\ \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.025976566010862744\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.02955545423677885,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.02955545423677885\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38722294654498046,\n\
\ \"acc_stderr\": 0.012441155326854922,\n \"acc_norm\": 0.38722294654498046,\n\
\ \"acc_norm_stderr\": 0.012441155326854922\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.02993534270787774,\n\
\ \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.02993534270787774\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.576797385620915,\n \"acc_stderr\": 0.019987809769482064,\n \
\ \"acc_norm\": 0.576797385620915,\n \"acc_norm_stderr\": 0.019987809769482064\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\
\ \"acc_stderr\": 0.028996909693328913,\n \"acc_norm\": 0.7860696517412935,\n\
\ \"acc_norm_stderr\": 0.028996909693328913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653697,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653697\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3561811505507956,\n\
\ \"mc1_stderr\": 0.016763790728446335,\n \"mc2\": 0.5111163939897228,\n\
\ \"mc2_stderr\": 0.015418045555863789\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.012248806969376422\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2896133434420015,\n \
\ \"acc_stderr\": 0.012493927348659629\n }\n}\n```"
repo_url: https://huggingface.co/ZySec-AI/ZySec-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|arc:challenge|25_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|gsm8k|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hellaswag|10_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T02-20-54.183750.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T02-20-54.183750.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- '**/details_harness|winogrande|5_2024-03-22T02-20-54.183750.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T02-20-54.183750.parquet'
- config_name: results
data_files:
- split: 2024_03_22T02_20_54.183750
path:
- results_2024-03-22T02-20-54.183750.parquet
- split: latest
path:
- results_2024-03-22T02-20-54.183750.parquet
---
# Dataset Card for Evaluation run of ZySec-AI/ZySec-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ZySec-AI/ZySec-7B](https://huggingface.co/ZySec-AI/ZySec-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ZySec-AI__ZySec-7B",
"harness_winogrande_5",
split="train")
```
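Because each run's split name is its timestamp, the most recent run can also be identified without relying on the "latest" alias, since the ISO-like format sorts chronologically. A minimal sketch (the split names below are illustrative):

```python
# Each evaluation run is stored under a split named after its timestamp,
# e.g. "2024_03_22T02_20_54.183750". Because these names are ISO-like,
# lexicographic order matches chronological order, so max() picks the
# newest run. The split names here are illustrative examples.
splits = ["2024_01_10T11_00_00.000000", "2024_03_22T02_20_54.183750"]
newest = max(splits)
print(newest)  # -> 2024_03_22T02_20_54.183750
```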
## Latest results
These are the [latest results from run 2024-03-22T02:20:54.183750](https://huggingface.co/datasets/open-llm-leaderboard/details_ZySec-AI__ZySec-7B/blob/main/results_2024-03-22T02-20-54.183750.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5833427561467389,
"acc_stderr": 0.03342813901320813,
"acc_norm": 0.5898847098167896,
"acc_norm_stderr": 0.034124395634022184,
"mc1": 0.3561811505507956,
"mc1_stderr": 0.016763790728446335,
"mc2": 0.5111163939897228,
"mc2_stderr": 0.015418045555863789
},
"harness|arc:challenge|25": {
"acc": 0.5204778156996587,
"acc_stderr": 0.01459913135303501,
"acc_norm": 0.5750853242320819,
"acc_norm_stderr": 0.014445698968520769
},
"harness|hellaswag|10": {
"acc": 0.5978888667596096,
"acc_stderr": 0.0048932206350117925,
"acc_norm": 0.7972515435172276,
"acc_norm_stderr": 0.004012249939174913
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.042992689054808644,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.042992689054808644
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.040329990539607195,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.040329990539607195
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887249,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887249
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6838709677419355,
"acc_stderr": 0.026450874489042778,
"acc_norm": 0.6838709677419355,
"acc_norm_stderr": 0.026450874489042778
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624335,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624335
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.031911782267135466,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.031911782267135466
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.025028610276710862,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.025028610276710862
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059274,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059274
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790222,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790222
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.03182231867647553,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.03182231867647553
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035303,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035303
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7701149425287356,
"acc_stderr": 0.01504630184669181,
"acc_norm": 0.7701149425287356,
"acc_norm_stderr": 0.01504630184669181
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6502890173410405,
"acc_stderr": 0.025674281456531015,
"acc_norm": 0.6502890173410405,
"acc_norm_stderr": 0.025674281456531015
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3005586592178771,
"acc_stderr": 0.015334566806251159,
"acc_norm": 0.3005586592178771,
"acc_norm_stderr": 0.015334566806251159
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.027530078447110314,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.027530078447110314
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934016,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934016
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6790123456790124,
"acc_stderr": 0.025976566010862744,
"acc_norm": 0.6790123456790124,
"acc_norm_stderr": 0.025976566010862744
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.02955545423677885,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.02955545423677885
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38722294654498046,
"acc_stderr": 0.012441155326854922,
"acc_norm": 0.38722294654498046,
"acc_norm_stderr": 0.012441155326854922
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.02993534270787774,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.02993534270787774
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.576797385620915,
"acc_stderr": 0.019987809769482064,
"acc_norm": 0.576797385620915,
"acc_norm_stderr": 0.019987809769482064
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.028996909693328913,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.028996909693328913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653697,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653697
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3561811505507956,
"mc1_stderr": 0.016763790728446335,
"mc2": 0.5111163939897228,
"mc2_stderr": 0.015418045555863789
},
"harness|winogrande|5": {
"acc": 0.745067087608524,
"acc_stderr": 0.012248806969376422
},
"harness|gsm8k|5": {
"acc": 0.2896133434420015,
"acc_stderr": 0.012493927348659629
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MohamedExperio/rvlTest_mini2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: train
num_bytes: 38285866.0
num_examples: 320
- name: validation
num_bytes: 36823102.0
num_examples: 320
- name: test
num_bytes: 37125936.0
num_examples: 320
download_size: 105373939
dataset_size: 112234904.0
---
# Dataset Card for "rvlTest_mini2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bzh-dataset/Korpus-frazennou-brezhonek | ---
language:
- fr
- br
license: unknown
configs:
- config_name: corpus
data_files: Korpus-frazennou-brezhonek.csv
sep: ;
---
# Korpus-frazennou-brezhonek
A corpus of 4,532 aligned, royalty-free bilingual (French and Breton) sentence pairs from the Office Public de la Langue Bretonne.
More information [here](https://www.fr.brezhoneg.bzh/212-donnees-libres-de-droits.htm).
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("bzh-dataset/Korpus-frazennou-brezhonek")
```
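
The underlying file is a `;`-separated CSV (per the `sep: ;` setting in the config above). A minimal parsing sketch using the standard library and a made-up two-row sample; the real column headers may differ, so check `dataset.column_names` after loading:

```python
import csv
import io

# Hypothetical sample mimicking the corpus format: ';'-separated,
# one French/Breton sentence pair per row.
sample = "fr;br\nBonjour;Demat\nMerci;Trugarez\n"

reader = csv.DictReader(io.StringIO(sample), delimiter=";")
rows = list(reader)
print(rows[0])  # {'fr': 'Bonjour', 'br': 'Demat'}
```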
|
tyzhu/find_sent_after_sent_train_400_eval_40_random_permute_1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 3057558.7870563674
num_examples: 2434
- name: validation
num_bytes: 232483
num_examples: 200
download_size: 1040869
dataset_size: 3290041.7870563674
---
# Dataset Card for "find_sent_after_sent_train_400_eval_40_random_permute_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Edopangui/promociones2 | ---
license: apache-2.0
---
|
diabolic6045/Images-of-Top-Indian-Cities | ---
license: apache-2.0
task_categories:
- image-classification
tags:
- India
- Cities
- Ahmedabad
- Delhi
- Kolkata
- Mumbai
- Kerala
size_categories:
- 10K<n<100K
---
# Dataset Card for Images-of-Top-Indian-Cities
Includes images of different Indian cities.
## Dataset Details
Each city has 2500 images
### Dataset Description
This dataset contains 2,500 images per class for popular Indian cities: Ahmedabad, Mumbai, Delhi, Kolkata, plus the state of Kerala.
- **Curated by:** Divax Shah and Team
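
The `10K<n<100K` size category in the metadata follows from the per-class count: five classes at 2,500 images each. A quick sanity check (class names as listed in the tags above):

```python
# Five classes, 2,500 images each, per the card's description.
classes = ["Ahmedabad", "Delhi", "Kolkata", "Mumbai", "Kerala"]
images_per_class = 2500

total = len(classes) * images_per_class
print(total)  # 12500, i.e. within the 10K<n<100K size bucket
```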
### Dataset Sources
Google
- **Demo:** [here](https://location-classification-of-indian-cities.streamlit.app/)
arXiv : https://arxiv.org/abs/2403.10912 |
Inline/grobid | ---
license: apache-2.0
---
|
bellagio-ai/t2i-vietnam-pictures | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 26817348.0
num_examples: 81
download_size: 26664289
dataset_size: 26817348.0
---
# Dataset Card for "t2i-vietnam-pictures"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anyspeech/PhoneCorpus | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: phones
dtype: string
splits:
- name: train
num_bytes: 264095984
num_examples: 10382114
download_size: 143568761
dataset_size: 264095984
---
# Dataset Card for "PhoneCorpus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/trec-cast_v1_2020 | ---
pretty_name: '`trec-cast/v1/2020`'
viewer: false
source_datasets: ['irds/trec-cast_v1']
task_categories:
- text-retrieval
---
# Dataset Card for `trec-cast/v1/2020`
The `trec-cast/v1/2020` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/trec-cast#trec-cast/v1/2020).
# Data
This dataset provides:
- `queries` (i.e., topics); count=216
- `qrels`: (relevance assessments); count=40,451
- For `docs`, use [`irds/trec-cast_v1`](https://huggingface.co/datasets/irds/trec-cast_v1)
This dataset is used by: [`trec-cast_v1_2020_judged`](https://huggingface.co/datasets/irds/trec-cast_v1_2020_judged)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/trec-cast_v1_2020', 'queries')
for record in queries:
record # {'query_id': ..., 'raw_utterance': ..., 'automatic_rewritten_utterance': ..., 'manual_rewritten_utterance': ..., 'manual_canonical_result_id': ..., 'topic_number': ..., 'turn_number': ...}
qrels = load_dataset('irds/trec-cast_v1_2020', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
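
Each qrels record follows the schema shown in the snippet above. A minimal sketch of grouping judgments by query, using made-up records with that schema (the `query_id`/`doc_id` values here are hypothetical):

```python
from collections import defaultdict

# Hypothetical records following the qrels schema shown above.
qrels = [
    {"query_id": "81_1", "doc_id": "CAR_a1", "relevance": 2, "iteration": "0"},
    {"query_id": "81_1", "doc_id": "MARCO_b2", "relevance": 0, "iteration": "0"},
    {"query_id": "82_1", "doc_id": "CAR_c3", "relevance": 1, "iteration": "0"},
]

# Group judgments by query for per-topic evaluation.
by_query = defaultdict(list)
for rec in qrels:
    by_query[rec["query_id"]].append(rec)

print(len(by_query["81_1"]))  # 2
```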
## Citation Information
```
@inproceedings{Dalton2020Cast,
title={CAsT 2020: The Conversational Assistance Track Overview},
author={Jeffrey Dalton and Chenyan Xiong and Jamie Callan},
booktitle={TREC},
year={2020}
}
```
|
jw0303/20230110 | ---
license: apache-2.0
---
|
buddhist-nlp/daizhige-masked | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9403385615
num_examples: 24759486
- name: val
num_bytes: 957032
num_examples: 2500
- name: test
num_bytes: 919749
num_examples: 2500
download_size: 2899144100
dataset_size: 9405262396
---
# Dataset Card for "daizhige-masked"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carnival13/rbrt_lrg_trn | ---
dataset_info:
features:
- name: domain_label
dtype: int64
- name: pass_label
dtype: int64
- name: input
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 418128930
num_examples: 339120
download_size: 121309096
dataset_size: 418128930
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rbrt_lrg_trn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_FelixChao__MathDolphin-7B | ---
pretty_name: Evaluation run of FelixChao/MathDolphin-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FelixChao/MathDolphin-7B](https://huggingface.co/FelixChao/MathDolphin-7B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__MathDolphin-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T13:48:07.624647](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__MathDolphin-7B/blob/main/results_2024-01-14T13-48-07.624647.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.65315875756261,\n\
\ \"acc_stderr\": 0.03196302131707709,\n \"acc_norm\": 0.6538202133223454,\n\
\ \"acc_norm_stderr\": 0.03261743110455684,\n \"mc1\": 0.3708690330477356,\n\
\ \"mc1_stderr\": 0.01690969358024882,\n \"mc2\": 0.5291514968771067,\n\
\ \"mc2_stderr\": 0.015285199336849235\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6279863481228669,\n \"acc_stderr\": 0.014124597881844461,\n\
\ \"acc_norm\": 0.658703071672355,\n \"acc_norm_stderr\": 0.01385583128749773\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6622186815375424,\n\
\ \"acc_stderr\": 0.004719870074967248,\n \"acc_norm\": 0.8549093806014738,\n\
\ \"acc_norm_stderr\": 0.0035147239847366034\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469553,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469553\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642514,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223168,\n\
\ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223168\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971118,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971118\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948482,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948482\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887034,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887034\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124064,\n\
\ \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.02646056956124064\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\
\ \"acc_stderr\": 0.03036037971029195,\n \"acc_norm\": 0.7130044843049327,\n\
\ \"acc_norm_stderr\": 0.03036037971029195\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934724,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934724\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n\
\ \"acc_stderr\": 0.01627792703963819,\n \"acc_norm\": 0.3854748603351955,\n\
\ \"acc_norm_stderr\": 0.01627792703963819\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958143,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958143\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.025403832978179604,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.025403832978179604\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.023468429832451152,\n\
\ \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.023468429832451152\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.012752858346533133,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.012752858346533133\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462937,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462937\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3708690330477356,\n\
\ \"mc1_stderr\": 0.01690969358024882,\n \"mc2\": 0.5291514968771067,\n\
\ \"mc2_stderr\": 0.015285199336849235\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435091\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6785443517816527,\n \
\ \"acc_stderr\": 0.012864471384836703\n }\n}\n```"
repo_url: https://huggingface.co/FelixChao/MathDolphin-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|arc:challenge|25_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|gsm8k|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hellaswag|10_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T13-48-07.624647.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T13-48-07.624647.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- '**/details_harness|winogrande|5_2024-01-14T13-48-07.624647.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T13-48-07.624647.parquet'
- config_name: results
data_files:
- split: 2024_01_14T13_48_07.624647
path:
- results_2024-01-14T13-48-07.624647.parquet
- split: latest
path:
- results_2024-01-14T13-48-07.624647.parquet
---
# Dataset Card for Evaluation run of FelixChao/MathDolphin-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/MathDolphin-7B](https://huggingface.co/FelixChao/MathDolphin-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__MathDolphin-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-14T13:48:07.624647](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__MathDolphin-7B/blob/main/results_2024-01-14T13-48-07.624647.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.65315875756261,
"acc_stderr": 0.03196302131707709,
"acc_norm": 0.6538202133223454,
"acc_norm_stderr": 0.03261743110455684,
"mc1": 0.3708690330477356,
"mc1_stderr": 0.01690969358024882,
"mc2": 0.5291514968771067,
"mc2_stderr": 0.015285199336849235
},
"harness|arc:challenge|25": {
"acc": 0.6279863481228669,
"acc_stderr": 0.014124597881844461,
"acc_norm": 0.658703071672355,
"acc_norm_stderr": 0.01385583128749773
},
"harness|hellaswag|10": {
"acc": 0.6622186815375424,
"acc_stderr": 0.004719870074967248,
"acc_norm": 0.8549093806014738,
"acc_norm_stderr": 0.0035147239847366034
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469553,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469553
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642514,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223168,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223168
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971118,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971118
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948482,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887034,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887034
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124064,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124064
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.03036037971029195,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.03036037971029195
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934724,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934724
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3854748603351955,
"acc_stderr": 0.01627792703963819,
"acc_norm": 0.3854748603351955,
"acc_norm_stderr": 0.01627792703963819
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958143,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179604,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.023468429832451152,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.023468429832451152
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533133,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533133
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462937,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462937
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3708690330477356,
"mc1_stderr": 0.01690969358024882,
"mc2": 0.5291514968771067,
"mc2_stderr": 0.015285199336849235
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435091
},
"harness|gsm8k|5": {
"acc": 0.6785443517816527,
"acc_stderr": 0.012864471384836703
}
}
```
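As a quick illustration of how figures like those above can be post-processed, here is a minimal sketch that averages per-subtask accuracies from a results dictionary shaped like the JSON block above. The two entries are copied from the reported results; the subtask selection and variable names are illustrative, not part of the evaluation harness.

```python
# Average accuracy over the MMLU ("hendrycksTest") subtasks present in a
# results dict shaped like the JSON above. Illustrative sketch only.
results = {
    "harness|hendrycksTest-nutrition|5": {"acc": 0.7418300653594772},
    "harness|hendrycksTest-philosophy|5": {"acc": 0.7234726688102894},
}

mmlu_accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU subtask average: {mmlu_avg:.4f}")
```

The same pattern extends to any of the per-task entries in the results file.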
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_harborwater__open-llama-3b-everythingLM-2048 | ---
pretty_name: Evaluation run of harborwater/open-llama-3b-everythingLM-2048
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [harborwater/open-llama-3b-everythingLM-2048](https://huggingface.co/harborwater/open-llama-3b-everythingLM-2048)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_harborwater__open-llama-3b-everythingLM-2048\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2023-10-24T01:01:11.414021](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__open-llama-3b-everythingLM-2048/blob/main/results_2023-10-24T01-01-11.414021.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.00039210421902986076,\n \"f1\": 0.053537122483221615,\n\
\ \"f1_stderr\": 0.0012879336042021898,\n \"acc\": 0.3390732138444075,\n\
\ \"acc_stderr\": 0.008325489359560807\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.00039210421902986076,\n\
\ \"f1\": 0.053537122483221615,\n \"f1_stderr\": 0.0012879336042021898\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.015163002274450341,\n \
\ \"acc_stderr\": 0.003366022949726365\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6629834254143646,\n \"acc_stderr\": 0.01328495576939525\n\
\ }\n}\n```"
repo_url: https://huggingface.co/harborwater/open-llama-3b-everythingLM-2048
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|arc:challenge|25_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T01_01_11.414021
path:
- '**/details_harness|drop|3_2023-10-24T01-01-11.414021.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T01-01-11.414021.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T01_01_11.414021
path:
- '**/details_harness|gsm8k|5_2023-10-24T01-01-11.414021.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T01-01-11.414021.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hellaswag|10_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-05-25.924210.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T08-05-25.924210.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T08-05-25.924210.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T01_01_11.414021
path:
- '**/details_harness|winogrande|5_2023-10-24T01-01-11.414021.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T01-01-11.414021.parquet'
- config_name: results
data_files:
- split: 2023_10_04T08_05_25.924210
path:
- results_2023-10-04T08-05-25.924210.parquet
- split: 2023_10_24T01_01_11.414021
path:
- results_2023-10-24T01-01-11.414021.parquet
- split: latest
path:
- results_2023-10-24T01-01-11.414021.parquet
---
# Dataset Card for Evaluation run of harborwater/open-llama-3b-everythingLM-2048
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/harborwater/open-llama-3b-everythingLM-2048
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [harborwater/open-llama-3b-everythingLM-2048](https://huggingface.co/harborwater/open-llama-3b-everythingLM-2048) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_harborwater__open-llama-3b-everythingLM-2048",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T01:01:11.414021](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__open-llama-3b-everythingLM-2048/blob/main/results_2023-10-24T01-01-11.414021.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902986076,
"f1": 0.053537122483221615,
"f1_stderr": 0.0012879336042021898,
"acc": 0.3390732138444075,
"acc_stderr": 0.008325489359560807
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902986076,
"f1": 0.053537122483221615,
"f1_stderr": 0.0012879336042021898
},
"harness|gsm8k|5": {
"acc": 0.015163002274450341,
"acc_stderr": 0.003366022949726365
},
"harness|winogrande|5": {
"acc": 0.6629834254143646,
"acc_stderr": 0.01328495576939525
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
GeorgeEid/reuters_articles | ---
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 13792576
num_examples: 17262
- name: validation
num_bytes: 1870389
num_examples: 2158
- name: test
num_bytes: 1379190
num_examples: 2158
download_size: 10073414
dataset_size: 17042155
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
ibranze/araproje_arc_tr_conf_gpt2_nearestscore_true | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: validation
num_bytes: 86423.0
num_examples: 250
download_size: 50681
dataset_size: 86423.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_arc_tr_conf_gpt2_nearestscore_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Technoculture/synthetic-clinical-notes-embedded | ---
language:
- en
license: mit
size_categories:
- 100K<n<1M
task_categories:
- question-answering
- summarization
pretty_name: Synthetic Clinical Notes
tags:
- starmpcc/Asclepius-Synthetic-Clinical-Notes
- BAAI/bge-small-en-v1.5
- medical
dataset_info:
features:
- name: output
dtype: string
- name: task
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: input_embedding
sequence: float32
- name: output_embedding
sequence: float64
splits:
- name: train
num_bytes: 1199998956
num_examples: 158114
download_size: 967764780
dataset_size: 1199998956
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Synthetic Clinical Notes
This dataset is a post-processed version of [starmpcc/Asclepius-Synthetic-Clinical-Notes](https://huggingface.co/datasets/starmpcc/Asclepius-Synthetic-Clinical-Notes):
- Turned into Alpaca format (`instruction`, `input`, and `output`)
- Added embeddings for the `input` and `output` columns using [BAAI/bge-small-en-v1.5](https://huggingface.co/datasets/BAAI/bge-small-en-v1.5)
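Because the embeddings are stored alongside the text, they can be compared directly without re-encoding. A minimal cosine-similarity retrieval sketch over the embedding columns (the `load_dataset` usage under the `__main__` guard is illustrative only and requires network access):

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two embedding vectors."""
    a = np.asarray(a, dtype=np.float64)
    b = np.asarray(b, dtype=np.float64)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def top_k(query_vec, vectors, k=3):
    """Indices of the k vectors most similar to query_vec, best first."""
    scores = [cosine_sim(query_vec, v) for v in vectors]
    return sorted(range(len(scores)), key=lambda i: -scores[i])[:k]

if __name__ == "__main__":
    # Illustrative usage: requires the `datasets` library and network access.
    from datasets import load_dataset
    ds = load_dataset("Technoculture/synthetic-clinical-notes-embedded", split="train")
    sample = ds.select(range(100))
    idx = top_k(sample[0]["input_embedding"],
                [row["output_embedding"] for row in sample])
    print(idx)
```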
| | Details |
| --------------------- | -------------------------------------------------- |
| Sample Count | 158k |
| Token Count | 648m |
| Origin | https://figshare.com/authors/Zhengyun_Zhao/16480335|
| Source of raw data | PubMed Central (PMC) and MIMIC 3 |
| Processing details | [original](https://huggingface.co/datasets/starmpcc/Asclepius-Synthetic-Clinical-Notes), [paper](https://arxiv.org/pdf/2309.00237.pdf) <a target="_blank" href="https://colab.research.google.com/drive/12nk-nLo46P8GOVqpBIA2wDAYj5SnUGW5?usp=sharing"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/></a> |
| Embedding Model | [BAAI/bge-small-en-v1.5](https://huggingface.co/datasets/BAAI/bge-small-en-v1.5) |
## Data Diversity
| index | Example Output | GPT-4 Rationale | GPT-4 Diversity Rating |
|-------|----------------|-----------------|------------------------|
| 137083| The coreferential expressions used to refer to the patient's severe bioprosthetic mitral valve stenosis and severe tricuspid regurgitation in the hospital course section of the discharge summary were "the patient had an irregular heartbeat with a diastolic murmur detected by auscultation" and "Transthoracic echocardiography revealed severe bioprosthetic mitral valve stenosis and severe tricuspid regurgitation." | Cardiology, Diagnostic Imaging, Physical Examination | 5 |
| 113558| The coreference resolved in the hospital course section related to the patient's perforation in the sigmoid colon is that the perforation found in the colon was 3-cm long and located 5cm above the anastomosis. This led to a colon segmental resection with loop sigmoid colostomy and subsequent recovery with no complications. | Gastrointestinal Surgery, Perforation Location, Post-surgical Recovery | 5 |
| 97204 | The prescribed biologic medications, Adalimumab and later Certolizumab, were used to treat the resurgence of the patient's tattoo manifestations after tapering of systemic glucocorticoids, but Adalimumab caused an injection site reaction, which prompted a change to Certolizumab. | Pharmacology, Medication Adjustment, Treatment Complications | 5 |
| 53669 | In the hospital course of the discharge summary, coreferences for the patient's respiratory status are resolved using terms such as "her pulmonary clinical signs," "she presented no signs of septic shock," and "her clinical condition finally improved." Coreferences for the patient's treatment are resolved using phrases such as "she was given three doses of spiramycin," "antimicrobial therapy with ceftriaxone was initiated," and "triple antimicrobial therapy with piperacillin-tazobactam, spiramycin, and amikacin was introduced." | Respiratory Infection, Antimicrobial Therapy, Clinical Improvement | 5 |
| 39865 | Using Named Entity Recognition in the discharge summary, the identified named entities related to Stickler syndrome are "Stickler syndrome" and "beaded vitreous phenotype." The identified named entities related to diagnostic testing are "Multiplex Ligation-dependent Probe Amplification (MLPA)" and "exons 41 and 42 [c.3025-3168, p.Gly1009-Val1056]." However, it should be noted that the discharge summary does not provide a comprehensive list of all named entities related to Stickler syndrome and diagnostic testing, and further review of the patient's medical records may be necessary for a complete analysis. | Genetic Testing, Stickler Syndrome, Diagnostic Specificity | 5 |
| 85187 | The patient was diagnosed with metastatic Leydig cell tumour of the spine and underwent surgery through a right subscapular 3rd rib thoracotomy followed by postoperative radiotherapy with radical intent. The patient is advised to follow up regularly as per oncologist's advice and to come back immediately in case of any medical emergency. No discharge medications were given as per the discharge summary. | Oncology, Surgical Approach, Radiotherapy | 5 |
| 99107 | The patient had a complicated problem with their heart's aortic valve and the wall dividing the two chambers of their heart. The valve became detached and the wall had growths on it, likely from an infection. Despite treatment, the patient's condition worsened and they were made comfortable with symptom control and palliative care before passing away. | Cardiac Condition, Palliative Care, End-of-Life | 5 |
| 65981 | The diagnosis for the 10-year-old female patient was a non-displaced scaphoid fracture, and the diagnostic studies used were a dual-energy computed tomography (DECT) scan which showed bone marrow edema (BME) in the scaphoid bone on VNCa images and a confirmatory magnetic resonance imaging (MRI). | Pediatric Orthopedics, Advanced Imaging, Fracture Diagnosis | 5 |
| 68814 | The expanded forms of the abbreviations in the hospital course section are: transnasal endoscopic excision (removal of pituitary adenoma using an endoscope through the nasal cavity) and MRN (medical record number). The diagnosis section abbreviations do not need expansion as they are already spelled out (pituitary adenoma). | Endoscopic Surgery, Pituitary Adenoma, Abbreviation Clarification | 5 |
| 16059 | Based on the given discharge summary, the named entities related to Patient 1's diagnosis of influenza B that can be identified are the diagnosis itself and the prescribed medication, oseltamivir. However, there is no mention of the patient's prior immunization history or any recommendations for future vaccination. Therefore, we cannot fully respond to the healthcare professional's instruction regarding receiving the influenza vaccination to prevent future infections. | Infectious Disease, Influenza B Treatment, Pharmacological Management | 5 |

## Data Lineage
```text
Technoculture/Synthetic-Clinical-Notes
↳ starmpcc/Asclepius-Synthetic-Clinical-Notes
↳ zhengyun21/PMC-Patients [code](https://github.com/zhao-zy15/PMC-Patients)
↳ PubMed Central (PMC)
```
---
> prompt for GPT-4 based annotation on diversity
> ```text
> | index | Example Output |
> |--------|---------------|
> | 137083 | The coreferential expressions used to refer to the patient's severe bioprosthetic mitral valve stenosis and severe tricuspid regurgitation in the hospital course section of the discharge summary were "the patient had an irregular heartbeat with a diastolic murmur detected by auscultation" and "Transthoracic echocardiography revealed severe bioprosthetic mitral valve stenosis and severe tricuspid regurgitation." |
>
> for each row, add 2 columns.
>
> Column 3 named 'GPT-4 Rationale': Rationale for how it is similar and/or diverse with respect to all the other examples in the table.
> Column 4 named 'GPT-4 Diversity Rating': mark for how diverse the example is from all the other examples in the table.
>
> Rating System:
> 0-1: Not Diverse - Almost identical to another example in the table
> 2-3: Very Similar - A somewhat similar example exists in the table
> 4: Fairly Diverse - A fairly dissimilar example from any other example in the table
> 5: Very Diverse - Completely dissimilar to any other example in the table
>
> Return escaped markdown so it can be copied pasted as is.
> ``` |
Miuzarte/SUISovitsDataForSingingModel | ---
language:
- zh
tags:
- AIvtuber
- VirtuaReal
---
# 岁己SUI sovits singing model dataset
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
#### ForSingingModel.zip:
The data quality is mediocre; it is not recommended for projects such as diff-svc that have higher data-quality requirements
The sampling rate is 44.1kHz; note that preprocessing is needed before use
Taken from 岁己's stream recordings from December 2022, January 2023, and February 1–17, 2023 (excluding radio streams, 268:07:43 in total), 岁己's uploaded videos, and the song clips of [A1in_sy](https://space.bilibili.com/89636742) from November and earlier, filtered and processed through the following steps
1. Picked segments with a higher audio bitrate, quieter accompaniment, and accompaniment that UVR can remove fairly cleanly (09:31:44)_[[Usable.zip]](https://huggingface.co/datasets/Miuzarte/SUISovitsDataForSingingModel/blob/main/%E6%9C%89%E7%9A%84%E6%B2%A1%E7%9A%84/Usable.zip)
2. Processed separately with [UVR5](https://github.com/Anjok07/ultimatevocalremovergui) VR Architecture 3_HP-Vocal-UVR, 4_HP-Vocal-UVR, and 5_HP-Karaoke-UVR, removing vocals and harmonies from the BGM as much as possible (09:31:43)
3. Manually trimmed useless and flawed segments in Adobe Audition (06:58:14)_[[UVR-ed.zip]](https://huggingface.co/datasets/Miuzarte/SUISovitsDataForSingingModel/blob/main/%E6%9C%89%E7%9A%84%E6%B2%A1%E7%9A%84/UVR-ed.zip)
4. Sliced with [Audio Slicer](https://github.com/flutydeer/audio-slicer) and deleted segments that were too short or too long (06:08:52)_[[Slice-d.zip]](https://huggingface.co/datasets/Miuzarte/SUISovitsDataForSingingModel/blob/main/%E6%9C%89%E7%9A%84%E6%B2%A1%E7%9A%84/Slice-d.zip)
5. Loudness-normalized with [Fish Audio Preprocessor](https://github.com/fishaudio/audio-preprocess) (06:08:52)_[[ForSingingModel.zip]](https://huggingface.co/datasets/Miuzarte/SUISovitsDataForSingingModel/blob/main/ForSingingModel.zip)
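Step 5 above normalizes loudness across clips. A minimal sketch of the idea using a simple RMS-based gain (the real Fish Audio Preprocessor uses proper loudness measurement, so this is only an illustration of the concept):

```python
import numpy as np

def rms_normalize(samples: np.ndarray, target_dbfs: float = -20.0) -> np.ndarray:
    """Scale a mono float waveform so its RMS level reaches target_dbfs."""
    rms = np.sqrt(np.mean(samples.astype(np.float64) ** 2))
    if rms == 0.0:
        return samples  # silence: nothing to scale
    target_rms = 10.0 ** (target_dbfs / 20.0)
    return samples * (target_rms / rms)
```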
File structure:
```
ForSingingModel.zip
├── 1.wav
├── ......
├── 911.wav
├── 25788785-20221210-200143-856_01_(Vocals)_0_0.wav
├── ......
└── 25788785-20230217-230042-820_02_(Vocals)_13.wav
```
#### ForSingingModel_sovits3.0.zip:
The dataset obtained by preprocessing ForSingingModel.zip; it can be fed directly into sovits3.0_48k. The sampling rate is 48kHz
File structure:
```
ForSingingModel_sovits3.0.zip
├── configs
│ └── config.json
├── dataset
│ └── 48k
│ └── suijiSUI
│ ├── 1.wav
│ ├── 1.wav.f0.npy
│ ├── 1.wav.soft.pt
│ ├── ......
│ ├── 25788785-20230217-230042-820_02_(Vocals)_13.wav
│ ├── 25788785-20230217-230042-820_02_(Vocals)_13.wav.f0.npy
│ └── 25788785-20230217-230042-820_02_(Vocals)_13.wav.soft.pt
└── filelists
├── test.txt
├── train.txt
└── val.txt
```
#### ForSingingModel_sovits4.0.zip:
The dataset obtained by preprocessing ForSingingModel.zip; it can be fed directly into sovits4.0. The sampling rate is 44.1kHz
Note: starting from 4.0 the default batch_size in config.json is 6; I changed it back to 12
File structure:
```
ForSingingModel_sovits4.0.zip
├── configs
│ └── config.json
├── dataset
│ └── 44k
│ └── suijiSUI
│ ├── 1.wav
│ ├── 1.wav.f0.npy
│ ├── 1.wav.soft.pt
│ ├── ......
│ ├── 25788785-20230217-230042-820_02_(Vocals)_13.wav
│ ├── 25788785-20230217-230042-820_02_(Vocals)_13.wav.f0.npy
│ └── 25788785-20230217-230042-820_02_(Vocals)_13.wav.soft.pt
└── filelists
├── test.txt
├── train.txt
└── val.txt
```
av IDs of the videos used:
```
|迷幻慵懒浪漫氛围歌曲| 深夜卧室的氛围感-wait a minute _ av431181253
“整个夏天,想和你环游世界” 试图抓住夏天的尾巴 _ av984968322
3秒带你重回十年前,当年“血洗”qq空间的歌曲,你还记得吗 _ av815358458
3秒让你直呼老公!《I wanna be your slave》 _ av558796317
当我躺在床上摆烂时写的歌 _ av344838098
身体倒是很诚实呢 _ av221246263
试着像楪祈一样温柔地唱“Departures 〜献给你的爱之歌 〜”罪恶王冠ED _ av303334059
试着用治愈的声音唱了《ハレハレヤ》- 朗朗晴天 _ av345498614
【岁己】 366日 _ av561787823
【岁己】 City of Stars _ av561703608
【岁己】 Ghost of a smile _ av689168602
【岁己】 Mela! _ av346648893
【岁己】 Rainbow Girl _ av561705190
【岁己】 The Loneliest Girl _ av732870463
【岁己】 Zzz _ av562589180
【岁己】 ごはんはおかず / 米饭是菜 _ av732063178
【岁己】 たばこ / 烟草 _ av562079329
【岁己】 たばこ _ av473902821
【岁己】 カタオモイ / 单相思 _ av604002659
【岁己】 ギターと孤独と蒼い惑星 / 吉他与孤独与蓝色星球 _ av732714359
【岁己】 万物生 _ av304499468
【岁己】 与你有关 _ av902626120
【岁己】 你的猫咪 _ av346808966
【岁己】 光 _ av219087863
【岁己】 匆匆那年 _ av944906256
【岁己】 唯一 _ av902191203
【岁己】 大风吹 _ av944120506
【岁己】 小半 _ av219092542
【岁己】 左手指月 _ av816979713
【岁己】 干花 _ av773894772
【岁己】 心墙 _ av986376224
【岁己】 忘我 _ av388983298
【岁己】 想和你迎着台风去看海 _ av389690921
【岁己】 摇篮曲 _ av516342753
【岁己】 昨日青空 _ av817017904
【岁己】 暗号 _ av346525048
【岁己】 月牙湾 _ av901604367
【岁己】 有你的快乐 _ av689087340
【岁己】 杀死那个石家庄人 _ av732149102
【岁己】 歌舞伎町の女王 _ av262050432
【岁己】 残酷な天使のテーゼ _ av901194411
【岁己】 流年 _ av987548313
【岁己】 浴室 _ av561382034
【岁己】 理想情人 _ av520236739
【岁己】 白金DISCO _ av646240416
【岁己】 砂糖之歌与苦味舞步 _ av986766899
【岁己】 糸 _ av774272827
【岁己】 红豆 _ av816694580
【岁己】 致姗姗来迟的你 _ av520099130
【岁己】 若把你 _ av562184161
【岁己】 落日 _ av219066825
【岁己】 走马 _ av816599983
【岁己】 远旅休憩中的邂逅 _ av689278570
【岁己】 迷迭香 _ av901800711
【岁己】 逆光 _ av901580501
【岁己】 钻石裂痕 _ av558645765
【岁己】 香格里拉 _ av346809187
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
sproos/twitter-pairclass-es | ---
dataset_info:
features:
- name: sent1
sequence: string
- name: sent2
sequence: string
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 11427395
num_examples: 1
download_size: 4228525
dataset_size: 11427395
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter-pairclass-es"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
C-MTEB/MMarcoRetrieval-qrels | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
dataset_info:
features:
- name: qid
dtype: string
- name: pid
dtype: string
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 217670
num_examples: 7437
download_size: 113896
dataset_size: 217670
---
# Dataset Card for "MMarcoRetrieval-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TeeA/Vietnamese-Chart-Dataset | ---
dataset_info:
features:
- name: title
dtype: string
- name: x_title
dtype: string
- name: y_title
dtype: string
- name: x
dtype: string
- name: y
dtype: string
- name: file_name
dtype: string
- name: chart_type
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 115631536.42857143
num_examples: 5000
- name: test
num_bytes: 23422771.285714287
num_examples: 1000
- name: validation
num_bytes: 23502759.285714287
num_examples: 1000
download_size: 116048333
dataset_size: 162557067.00000003
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
muverrih38/1231243141231 | ---
license: other
---
|
mike008/wedo | ---
license: openrail
---
|
karimasbar/test1 | ---
license: mit
---
|
h2oai/openassistant_oasst1_h2ogpt_graded | ---
license: apache-2.0
language:
- en
thumbnail: https://h2o.ai/etc.clientlibs/h2o/clientlibs/clientlib-site/resources/images/favicon.ico
tags:
- gpt
- llm
- large language model
- open-source
---
# h2oGPT Data Card
## Summary
H2O.ai's `openassistant_oasst1_h2ogpt_graded` is an open-source instruct-type dataset for fine-tuning of large language models, licensed for commercial use.
- Number of rows: `30368`
- Number of columns: `5`
- Column names: `['input', 'source', 'prompt_type', 'grade_deberta', 'id']`
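The `grade_deberta` column carries a per-example quality score, so the dataset can be thinned before fine-tuning. A minimal sketch (the `load_dataset` call is illustrative and needs network access; the 0.8 threshold is an arbitrary example, not a recommended value):

```python
def filter_by_grade(rows, min_grade=0.5):
    """Keep only examples whose DeBERTa quality grade clears the threshold."""
    return [r for r in rows if r.get("grade_deberta", 0.0) >= min_grade]

if __name__ == "__main__":
    # Illustrative usage: requires the `datasets` library and network access.
    from datasets import load_dataset
    ds = load_dataset("h2oai/openassistant_oasst1_h2ogpt_graded", split="train")
    kept = filter_by_grade(list(ds), min_grade=0.8)
    print(len(kept), "of", len(ds), "rows kept")
```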
## Source
- [Original Open Assistant data in tree structure](https://huggingface.co/datasets/OpenAssistant/oasst1)
- [This flattened dataset created by script in h2oGPT repository](https://github.com/h2oai/h2ogpt/blob/d1f8ce975a46056d41135d126dd33de8499aa26e/create_data.py#L1259)
|
Coaso/yokote_test | ---
license: apache-2.0
task_categories:
- table-question-answering
language:
- ja
size_categories:
- n<1K
---
test |
TinyPixel/lima-1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1794727
num_examples: 780
download_size: 1043400
dataset_size: 1794727
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ljsilverstar/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: due_on
dtype: 'null'
- name: closed_at
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 11978694
num_examples: 1000
download_size: 3382926
dataset_size: 11978694
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Chr0my/public_flickr_photos_license_1 | ---
license: cc-by-nc-sa-3.0
---
119893266 photos from Flickr (https://www.flickr.com/creativecommons/by-nc-sa-2.0/)
---
All photos are under license id 1: Attribution-NonCommercial-ShareAlike License (https://creativecommons.org/licenses/by-nc-sa/2.0/) |
Nasiat/IUT_Regional_STT_Dataset | ---
license: apache-2.0
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
- name: input_length
dtype: int64
splits:
- name: train
num_bytes: 67472
num_examples: 20
download_size: 8032
dataset_size: 67472
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
youdiniplays/tagalog_to_waray | ---
license: apache-2.0
task_categories:
- translation
language:
- tl
--- |
T-Almeida/Pubmed2023-baseline-neox-tokenized-len | ---
dataset_info:
features:
- name: id
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: length
dtype: int64
splits:
- name: train
num_bytes: 33459575717
num_examples: 22522740
download_size: 14089336992
dataset_size: 33459575717
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
newsroom | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- other
multilinguality:
- monolingual
pretty_name: CORNELL NEWSROOM
size_categories:
- unknown
source_datasets:
- original
task_categories:
- summarization
task_ids:
- news-articles-summarization
paperswithcode_id: newsroom
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: title
dtype: string
- name: url
dtype: string
- name: date
dtype: string
- name: density_bin
dtype: string
- name: coverage_bin
dtype: string
- name: compression_bin
dtype: string
- name: density
dtype: float32
- name: coverage
dtype: float32
- name: compression
dtype: float32
splits:
- name: test
num_bytes: 472446866
num_examples: 108862
- name: train
num_bytes: 4357506078
num_examples: 995041
- name: validation
num_bytes: 473206951
num_examples: 108837
download_size: 0
dataset_size: 5303159895
---
# Dataset Card for "newsroom"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://lil.nlp.cornell.edu/newsroom/index.html](https://lil.nlp.cornell.edu/newsroom/index.html)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 0.00 MB
- **Size of the generated dataset:** 5.30 GB
- **Total amount of disk used:** 5.30 GB
### Dataset Summary
NEWSROOM is a large dataset for training and evaluating summarization systems.
It contains 1.3 million articles and summaries written by authors and
editors in the newsrooms of 38 major publications.
Dataset features include:
- text: Input news text.
- summary: Summary for the news.
And additional features:
- title: news title.
- url: url of the news.
- date: date of the article.
- density: extractive density.
- coverage: extractive coverage.
- compression: compression ratio.
- density_bin: low, medium, high.
- coverage_bin: extractive, abstractive.
- compression_bin: low, medium, high.
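The extractive statistics above can be approximated with a short sketch. This is a simplified, whitespace-tokenized version of the coverage and compression definitions from the NEWSROOM paper, not the exact released computation, so treat the numbers as rough approximations:

```python
def extractive_stats(text: str, summary: str):
    """Rough coverage and compression using naive whitespace tokens."""
    text_tokens = text.lower().split()
    summary_tokens = summary.lower().split()
    # Coverage: fraction of summary tokens that also occur in the article.
    text_vocab = set(text_tokens)
    coverage = sum(t in text_vocab for t in summary_tokens) / len(summary_tokens)
    # Compression: article length divided by summary length.
    compression = len(text_tokens) / len(summary_tokens)
    return coverage, compression

cov, comp = extractive_stats(
    "the cat sat on the mat near the door", "the cat sat on the mat"
)
# cov == 1.0 (fully extractive), comp == 1.5
```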
This dataset can be downloaded upon request. Unzip all the contents
(train.jsonl, dev.jsonl, test.jsonl) into the `tfds` folder.
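Once unzipped, each file is JSON Lines: one article record per line. A minimal sketch of parsing a record with the standard library (the sample line below is fabricated; field names follow the schema on this card):

```python
import json

# One fabricated record in the train.jsonl format described on this card.
sample_line = (
    '{"text": "some text 1", "summary": "some summary 1", '
    '"title": "news title 1", "url": "url.html"}'
)
record = json.loads(sample_line)
# In practice you would iterate over the real file, e.g.:
# with open("tfds/train.jsonl") as f:
#     records = [json.loads(line) for line in f]
```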
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
English (`en`).
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 0.00 MB
- **Size of the generated dataset:** 5.30 GB
- **Total amount of disk used:** 5.30 GB
An example of 'train' looks as follows.
```
{
"compression": 33.880001068115234,
"compression_bin": "medium",
"coverage": 1.0,
"coverage_bin": "high",
"date": "200600000",
"density": 11.720000267028809,
"density_bin": "extractive",
"summary": "some summary 1",
"text": "some text 1",
"title": "news title 1",
"url": "url.html"
}
```
### Data Fields
The data fields are the same among all splits.
#### default
- `text`: a `string` feature.
- `summary`: a `string` feature.
- `title`: a `string` feature.
- `url`: a `string` feature.
- `date`: a `string` feature.
- `density_bin`: a `string` feature.
- `coverage_bin`: a `string` feature.
- `compression_bin`: a `string` feature.
- `density`: a `float32` feature.
- `coverage`: a `float32` feature.
- `compression`: a `float32` feature.
### Data Splits
| name |train |validation| test |
|-------|-----:|---------:|-----:|
|default|995041| 108837|108862|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
https://cornell.qualtrics.com/jfe/form/SV_6YA3HQ2p75XH4IR
This Dataset Usage Agreement ("Agreement") is a legal agreement with the Cornell Newsroom Summaries Team ("Newsroom") for the Dataset made available to the individual or entity ("Researcher") exercising rights under this Agreement. "Dataset" includes all text, data, information, source code, and any related materials, documentation, files, media, updates or revisions.
The Dataset is intended for non-commercial research and educational purposes only, and is made available free of charge without extending any license or other intellectual property rights. By downloading or using the Dataset, the Researcher acknowledges that they agree to the terms in this Agreement, and represent and warrant that they have authority to do so on behalf of any entity exercising rights under this Agreement. The Researcher accepts and agrees to be bound by the terms and conditions of this Agreement. If the Researcher does not agree to this Agreement, they may not download or use the Dataset.
By sharing content with Newsroom, such as by submitting content to this site or by corresponding with Newsroom contributors, the Researcher grants Newsroom the right to use, reproduce, display, perform, adapt, modify, distribute, have distributed, and promote the content in any form, anywhere and for any purpose, such as for evaluating and comparing summarization systems. Nothing in this Agreement shall obligate Newsroom to provide any support for the Dataset. Any feedback, suggestions, ideas, comments, improvements given by the Researcher related to the Dataset is voluntarily given, and may be used by Newsroom without obligation or restriction of any kind.
The Researcher accepts full responsibility for their use of the Dataset and shall defend indemnify, and hold harmless Newsroom, including their employees, trustees, officers, and agents, against any and all claims arising from the Researcher's use of the Dataset. The Researcher agrees to comply with all laws and regulations as they relate to access to and use of the Dataset and Service including U.S. export jurisdiction and other U.S. and international regulations.
THE DATASET IS PROVIDED "AS IS." NEWSROOM DISCLAIMS ALL WARRANTIES, EXPRESS OR IMPLIED, INCLUDING THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT. WITHOUT LIMITATION OF THE ABOVE, NEWSROOM DISCLAIMS ANY WARRANTY THAT DATASET IS BUG OR ERROR-FREE, AND GRANTS NO WARRANTY REGARDING ITS USE OR THE RESULTS THEREFROM INCLUDING, WITHOUT LIMITATION, ITS CORRECTNESS, ACCURACY, OR RELIABILITY. THE DATASET IS NOT WARRANTIED TO FULFILL ANY PARTICULAR PURPOSES OR NEEDS.
TO THE EXTENT NOT PROHIBITED BY LAW, IN NO EVENT SHALL NEWSROOM BE LIABLE FOR ANY LOSS, DAMAGE OR INJURY, DIRECT AND INDIRECT, INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES, HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER FOR BREACH OF CONTRACT, TORT (INCLUDING NEGLIGENCE) OR OTHERWISE, ARISING OUT OF THIS AGREEMENT, INCLUDING BUT NOT LIMITED TO LOSS OF PROFITS, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. THESE LIMITATIONS SHALL APPLY NOTWITHSTANDING ANY FAILURE OF ESSENTIAL PURPOSE OF ANY LIMITED REMEDY.
This Agreement is effective until terminated. Newsroom reserves the right to terminate the Researcher's access to the Dataset at any time. If the Researcher breaches this Agreement, the Researcher's rights to use the Dataset shall terminate automatically. The Researcher will immediately cease all use and distribution of the Dataset and destroy any copies or portions of the Dataset in their possession.
This Agreement is governed by the laws of the State of New York, without regard to conflict of law principles. All terms and provisions of this Agreement shall, if possible, be construed in a manner which makes them valid, but in the event any term or provision of this Agreement is found by a court of competent jurisdiction to be illegal or unenforceable, the validity or enforceability of the remainder of this Agreement shall not be affected.
This Agreement is the complete and exclusive agreement between the parties with respect to its subject matter and supersedes all prior or contemporaneous oral or written agreements or understandings relating to the subject matter.
### Citation Information
```
@inproceedings{N18-1065,
author = {Grusky, Max and Naaman, Mor and Artzi, Yoav},
title = {NEWSROOM: A Dataset of 1.3 Million Summaries
with Diverse Extractive Strategies},
booktitle = {Proceedings of the 2018 Conference of the
North American Chapter of the Association for
Computational Linguistics: Human Language Technologies},
year = {2018},
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun), [@patrickvonplaten](https://github.com/patrickvonplaten), [@yoavartzi](https://github.com/yoavartzi), [@thomwolf](https://github.com/thomwolf) for adding this dataset. |
mlabonne/ministack-preferences | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: response_j
dtype: string
- name: response_k
dtype: string
splits:
- name: train
num_bytes: 2648404.1478804825
num_examples: 1000
- name: test
num_bytes: 2648404.1478804825
num_examples: 1000
download_size: 3061144
dataset_size: 5296808.295760965
---
# Ministack-preferences
Subset (1000 training samples and 1000 test samples) of the [`lvwerra/stack-exchange-paired`](https://huggingface.co/datasets/lvwerra/stack-exchange-paired) dataset. The original dataset is large and slow to process, so this subset should let you try RLHF a little faster. |
Nubletz/test | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': MSI
'1': MSS
splits:
- name: train
num_bytes: 112885.0
num_examples: 4
download_size: 114763
dataset_size: 112885.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SLB13/X1_Test | ---
license: apache-2.0
---
|
winglian/deduped-cortex-test002 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: run_id
dtype: string
- name: step
dtype: int64
- name: uid
dtype: int64
- name: id
dtype: string
splits:
- name: train
num_bytes: 61157002.37552902
num_examples: 26458
download_size: 31624894
dataset_size: 61157002.37552902
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
khalidalt/sungai_ul2_instructions | ---
dataset_info:
features:
- name: text
dtype: string
- name: metadata
struct:
- name: source
dtype: string
splits:
- name: train
num_bytes: 101304845
num_examples: 200000
download_size: 59812835
dataset_size: 101304845
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sungai_ul2_instructions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sameeryel/3D_Models | ---
license: unknown
---
|
open-llm-leaderboard/details_OpenAssistant__codellama-13b-oasst-sft-v10 | ---
pretty_name: Evaluation run of OpenAssistant/codellama-13b-oasst-sft-v10
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenAssistant/codellama-13b-oasst-sft-v10](https://huggingface.co/OpenAssistant/codellama-13b-oasst-sft-v10)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenAssistant__codellama-13b-oasst-sft-v10\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T06:23:43.342371](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__codellama-13b-oasst-sft-v10/blob/main/results_2023-10-15T06-23-43.342371.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n\
\ \"em_stderr\": 0.00045666764626669533,\n \"f1\": 0.07171875000000016,\n\
\ \"f1_stderr\": 0.0015908122454952622,\n \"acc\": 0.4049487994360847,\n\
\ \"acc_stderr\": 0.011226667727964289\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.00045666764626669533,\n\
\ \"f1\": 0.07171875000000016,\n \"f1_stderr\": 0.0015908122454952622\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13191811978771797,\n \
\ \"acc_stderr\": 0.009321265253857515\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6779794790844514,\n \"acc_stderr\": 0.013132070202071064\n\
\ }\n}\n```"
repo_url: https://huggingface.co/OpenAssistant/codellama-13b-oasst-sft-v10
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|arc:challenge|25_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|arc:challenge|25_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|arc:challenge|25_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T06_23_43.342371
path:
- '**/details_harness|drop|3_2023-10-15T06-23-43.342371.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T06-23-43.342371.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T06_23_43.342371
path:
- '**/details_harness|gsm8k|5_2023-10-15T06-23-43.342371.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T06-23-43.342371.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hellaswag|10_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hellaswag|10_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hellaswag|10_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T09:42:44.871031.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T18:08:08.712288.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T15-15-45.768968.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T09:42:44.871031.parquet'
- split: 2023_08_28T18_08_08.712288
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T18:08:08.712288.parquet'
- split: 2023_09_18T15_15_45.768968
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T15-15-45.768968.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T15-15-45.768968.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T06_23_43.342371
path:
- '**/details_harness|winogrande|5_2023-10-15T06-23-43.342371.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T06-23-43.342371.parquet'
- config_name: results
data_files:
- split: 2023_08_28T09_42_44.871031
path:
- results_2023-08-28T09:42:44.871031.parquet
- split: 2023_08_28T18_08_08.712288
path:
- results_2023-08-28T18:08:08.712288.parquet
- split: 2023_09_18T15_15_45.768968
path:
- results_2023-09-18T15-15-45.768968.parquet
- split: 2023_10_15T06_23_43.342371
path:
- results_2023-10-15T06-23-43.342371.parquet
- split: latest
path:
- results_2023-10-15T06-23-43.342371.parquet
---
# Dataset Card for Evaluation run of OpenAssistant/codellama-13b-oasst-sft-v10
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenAssistant/codellama-13b-oasst-sft-v10
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenAssistant/codellama-13b-oasst-sft-v10](https://huggingface.co/OpenAssistant/codellama-13b-oasst-sft-v10) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenAssistant__codellama-13b-oasst-sft-v10",
"harness_winogrande_5",
split="train")
```
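The timestamped split names above (e.g. `2023_09_18T15_15_45.768968`) use underscores in place of `-` and `:`. As a small hedged sketch (the helper name below is ours, not part of the `datasets` API), they can be parsed to pick the most recent run without relying on the `latest` alias:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    """Turn a split name like '2023_09_18T15_15_45.768968' into a datetime."""
    date_part, time_part = split_name.split("T")
    # Restore the standard ISO-like separators before parsing.
    normalized = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.strptime(normalized, "%Y-%m-%dT%H:%M:%S.%f")

# Split names taken from the configs above.
splits = ["2023_08_28T09_42_44.871031", "2023_09_18T15_15_45.768968"]
newest = max(splits, key=parse_split_timestamp)
```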
## Latest results
These are the [latest results from run 2023-10-15T06:23:43.342371](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__codellama-13b-oasst-sft-v10/blob/main/results_2023-10-15T06-23-43.342371.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626669533,
"f1": 0.07171875000000016,
"f1_stderr": 0.0015908122454952622,
"acc": 0.4049487994360847,
"acc_stderr": 0.011226667727964289
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626669533,
"f1": 0.07171875000000016,
"f1_stderr": 0.0015908122454952622
},
"harness|gsm8k|5": {
"acc": 0.13191811978771797,
"acc_stderr": 0.009321265253857515
},
"harness|winogrande|5": {
"acc": 0.6779794790844514,
"acc_stderr": 0.013132070202071064
}
}
```
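The aggregated metrics shown above can also be inspected programmatically; a minimal sketch (the dict literal simply mirrors two of the per-task entries printed above):

```python
# Two of the per-task entries from the latest results above.
results = {
    "harness|gsm8k|5": {"acc": 0.13191811978771797, "acc_stderr": 0.009321265253857515},
    "harness|winogrande|5": {"acc": 0.6779794790844514, "acc_stderr": 0.013132070202071064},
}

# Task with the highest accuracy in this subset.
best_task = max(results, key=lambda name: results[name]["acc"])
```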
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ParasiticRogue/Bluemoon-Tiny-Light | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- not-for-all-audiences
---
Only useful if you need a really small version.
Original dataset below:
https://huggingface.co/datasets/ParasiticRogue/Bluemoon-Light?not-for-all-audiences=true
|
awaisakhtar/order_dataset | ---
language:
- en
size_categories:
- 1K<n<10K
task_categories:
- question-answering
- conversational
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: System_Prompt
dtype: string
- name: Instruction
dtype: string
- name: Context
dtype: string
- name: Menu
dtype: string
- name: Conversation_History
dtype: string
- name: Response
dtype: string
splits:
- name: train
num_bytes: 25430540
num_examples: 5140
download_size: 1277262
dataset_size: 25430540
tags:
- order
---
# Dataset Card for "order_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_207 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1124125196.0
num_examples: 220763
download_size: 1148865336
dataset_size: 1124125196.0
---
# Dataset Card for "chunk_207"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
semarmendemx/csst2 | ---
license: apache-2.0
tags:
- language
--- |
nc33/MultiSpan_SQUAD | ---
license: mit
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: num_span
dtype: int64
- name: label
sequence: string
splits:
- name: train
num_bytes: 141866486
num_examples: 87599
- name: validation
num_bytes: 18219759
num_examples: 10570
download_size: 16941350
dataset_size: 160086245
---
|
balakhonoff/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Sowmya15/profanity | ---
license: mit
---
|
omupadhye/graphene_thesis | ---
license: openrail
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
severo/bug-16718038814382 | ---
dataset_info:
features:
- name: a
dtype: int64
splits:
- name: train
num_bytes: 24
num_examples: 3
download_size: 579
dataset_size: 24
---
# Dataset Card for "bug-16718038814382"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MaryDatascientist/en_dataset | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
- name: lang
dtype: string
splits:
- name: train
num_bytes: 98357684.72831541
num_examples: 262560
- name: validation
num_bytes: 12407333.773835126
num_examples: 32820
- name: test
num_bytes: 12419125.385081522
num_examples: 32908
download_size: 30300039
dataset_size: 123184143.88723207
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Tristan/olm-october-2022-tokenized-1024-suffix-array-dedup | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 81147320856
num_examples: 13181826
download_size: 21892490583
dataset_size: 81147320856
---
# Dataset Card for "olm-october-2022-tokenized-1024-suffix-array-dedup"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
binxia/LLMGA-dataset | ---
license: apache-2.0
---
|
CVasNLPExperiments/VQAv2_sample_validation_benchmarks_partition_6 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 58
num_examples: 2
download_size: 1368
dataset_size: 58
---
# Dataset Card for "VQAv2_sample_validation_benchmarks_partition_6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anezatra/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/GPTeacher_roleplay_standardized_cluster_0_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 449236
num_examples: 1416
download_size: 244810
dataset_size: 449236
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "GPTeacher_roleplay_standardized_cluster_0_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ziqingyang__chinese-alpaca-2-13b | ---
pretty_name: Evaluation run of ziqingyang/chinese-alpaca-2-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ziqingyang/chinese-alpaca-2-13b](https://huggingface.co/ziqingyang/chinese-alpaca-2-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ziqingyang__chinese-alpaca-2-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T20:22:27.142442](https://huggingface.co/datasets/open-llm-leaderboard/details_ziqingyang__chinese-alpaca-2-13b/blob/main/results_2023-10-15T20-22-27.142442.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.32728607382550334,\n\
\ \"em_stderr\": 0.004805279168508311,\n \"f1\": 0.4106134647651026,\n\
\ \"f1_stderr\": 0.004650726360819101,\n \"acc\": 0.4307653965208868,\n\
\ \"acc_stderr\": 0.010243166856230161\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.32728607382550334,\n \"em_stderr\": 0.004805279168508311,\n\
\ \"f1\": 0.4106134647651026,\n \"f1_stderr\": 0.004650726360819101\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10462471569370735,\n \
\ \"acc_stderr\": 0.008430668082029278\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7569060773480663,\n \"acc_stderr\": 0.012055665630431043\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ziqingyang/chinese-alpaca-2-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T20_22_27.142442
path:
- '**/details_harness|drop|3_2023-10-15T20-22-27.142442.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T20-22-27.142442.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T20_22_27.142442
path:
- '**/details_harness|gsm8k|5_2023-10-15T20-22-27.142442.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T20-22-27.142442.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T20_22_27.142442
path:
- '**/details_harness|winogrande|5_2023-10-15T20-22-27.142442.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T20-22-27.142442.parquet'
- config_name: results
data_files:
- split: 2023_10_15T20_22_27.142442
path:
- results_2023-10-15T20-22-27.142442.parquet
- split: latest
path:
- results_2023-10-15T20-22-27.142442.parquet
---
# Dataset Card for Evaluation run of ziqingyang/chinese-alpaca-2-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ziqingyang/chinese-alpaca-2-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ziqingyang/chinese-alpaca-2-13b](https://huggingface.co/ziqingyang/chinese-alpaca-2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ziqingyang__chinese-alpaca-2-13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-15T20:22:27.142442](https://huggingface.co/datasets/open-llm-leaderboard/details_ziqingyang__chinese-alpaca-2-13b/blob/main/results_2023-10-15T20-22-27.142442.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.32728607382550334,
"em_stderr": 0.004805279168508311,
"f1": 0.4106134647651026,
"f1_stderr": 0.004650726360819101,
"acc": 0.4307653965208868,
"acc_stderr": 0.010243166856230161
},
"harness|drop|3": {
"em": 0.32728607382550334,
"em_stderr": 0.004805279168508311,
"f1": 0.4106134647651026,
"f1_stderr": 0.004650726360819101
},
"harness|gsm8k|5": {
"acc": 0.10462471569370735,
"acc_stderr": 0.008430668082029278
},
"harness|winogrande|5": {
"acc": 0.7569060773480663,
"acc_stderr": 0.012055665630431043
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Joe02/mizushima_oonari | ---
license: other
---
|
ai-shift/ameba_faq_search | ---
task_categories:
- question-answering
language:
- ja
size_categories:
- 100K<n<1M
license: cc-by-nd-4.0
---
# AMEBA Blog FAQ Search Dataset
This data was obtained by crawling [this website](https://helps.ameba.jp/faq/).
The FAQ Data was processed to remove HTML tags and other formatting after crawling, and entries containing excessively long content were excluded.
The Query Data was generated using a Large Language Model (LLM). Please refer to the following blog posts for information about the generation process.
- https://www.ai-shift.co.jp/techblog/3710
- https://www.ai-shift.co.jp/techblog/3761
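As a rough sketch of working with the query files described below (the file and column names follow this card; the in-memory sample here is hypothetical, made up for illustration), queries can be filtered by difficulty with the standard library:

```python
import csv
import io

# Hypothetical in-memory sample mirroring queries_train.csv; the real files
# use the columns listed on this card (ID, Query, difficulty).
sample = io.StringIO(
    "ID,Query,difficulty\n"
    "1001,How do I change my blog design?,easy\n"
    "1002,Why was my comment deleted?,difficult\n"
)

rows = list(csv.DictReader(sample))

# Per the card, the train data contains only "easy" queries.
train_rows = [row for row in rows if row["difficulty"] == "easy"]
```

The same filter applies to any of the query CSVs; `target_faq.csv` can then be joined on the ID column to recover the gold FAQ for each query.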
## Column description
FAQ Data (target_faq.csv)
- ID: Unique ID of the FAQ
- Title: Title of the FAQ
- Content: Answer content of the FAQ
Query Data (queries_{train/validation/test}.csv)
- ID: Unique ID of the correct FAQ
- Query: Question text
- difficulty: The difficulty level of the problem
- Indicates whether the problem is related to the correct FAQ in the training set.
- If "easy", it is included in the train data; if "difficult", it is not included in the train data.
- The train data are all "easy". |
nathancday/imagenet_sketch_mini | ---
license: apache-2.0
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': tench, Tinca tinca
'1': goldfish, Carassius auratus
'2': great white shark, white shark, man-eater, man-eating shark, Carcharodon
carcharias
'3': tiger shark, Galeocerdo cuvieri
'4': hammerhead, hammerhead shark
'5': electric ray, crampfish, numbfish, torpedo
'6': stingray
'7': cock
'8': hen
'9': ostrich, Struthio camelus
'10': brambling, Fringilla montifringilla
'11': goldfinch, Carduelis carduelis
'12': house finch, linnet, Carpodacus mexicanus
'13': junco, snowbird
'14': indigo bunting, indigo finch, indigo bird, Passerina cyanea
'15': robin, American robin, Turdus migratorius
'16': bulbul
'17': jay
'18': magpie
'19': chickadee
'20': water ouzel, dipper
'21': kite
'22': bald eagle, American eagle, Haliaeetus leucocephalus
'23': vulture
'24': great grey owl, great gray owl, Strix nebulosa
'25': European fire salamander, Salamandra salamandra
'26': common newt, Triturus vulgaris
'27': eft
'28': spotted salamander, Ambystoma maculatum
'29': axolotl, mud puppy, Ambystoma mexicanum
'30': bullfrog, Rana catesbeiana
'31': tree frog, tree-frog
'32': tailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui
'33': loggerhead, loggerhead turtle, Caretta caretta
'34': leatherback turtle, leatherback, leathery turtle, Dermochelys coriacea
'35': mud turtle
'36': terrapin
'37': box turtle, box tortoise
'38': banded gecko
'39': common iguana, iguana, Iguana iguana
'40': American chameleon, anole, Anolis carolinensis
'41': whiptail, whiptail lizard
'42': agama
'43': frilled lizard, Chlamydosaurus kingi
'44': alligator lizard
'45': Gila monster, Heloderma suspectum
'46': green lizard, Lacerta viridis
'47': African chameleon, Chamaeleo chamaeleon
'48': Komodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus
komodoensis
'49': African crocodile, Nile crocodile, Crocodylus niloticus
'50': American alligator, Alligator mississipiensis
'51': triceratops
'52': thunder snake, worm snake, Carphophis amoenus
'53': ringneck snake, ring-necked snake, ring snake
'54': hognose snake, puff adder, sand viper
'55': green snake, grass snake
'56': king snake, kingsnake
'57': garter snake, grass snake
'58': water snake
'59': vine snake
'60': night snake, Hypsiglena torquata
'61': boa constrictor, Constrictor constrictor
'62': rock python, rock snake, Python sebae
'63': Indian cobra, Naja naja
'64': green mamba
'65': sea snake
'66': horned viper, cerastes, sand viper, horned asp, Cerastes cornutus
'67': diamondback, diamondback rattlesnake, Crotalus adamanteus
'68': sidewinder, horned rattlesnake, Crotalus cerastes
'69': trilobite
'70': harvestman, daddy longlegs, Phalangium opilio
'71': scorpion
'72': black and gold garden spider, Argiope aurantia
'73': barn spider, Araneus cavaticus
'74': garden spider, Aranea diademata
'75': black widow, Latrodectus mactans
'76': tarantula
'77': wolf spider, hunting spider
'78': tick
'79': centipede
'80': black grouse
'81': ptarmigan
'82': ruffed grouse, partridge, Bonasa umbellus
'83': prairie chicken, prairie grouse, prairie fowl
'84': peacock
'85': quail
'86': partridge
'87': African grey, African gray, Psittacus erithacus
'88': macaw
'89': sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita
'90': lorikeet
'91': coucal
'92': bee eater
'93': hornbill
'94': hummingbird
'95': jacamar
'96': toucan
'97': drake
'98': red-breasted merganser, Mergus serrator
'99': goose
'100': black swan, Cygnus atratus
'101': tusker
'102': echidna, spiny anteater, anteater
'103': platypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus
anatinus
'104': wallaby, brush kangaroo
'105': koala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus
'106': wombat
'107': jellyfish
'108': sea anemone, anemone
'109': brain coral
'110': flatworm, platyhelminth
'111': nematode, nematode worm, roundworm
'112': conch
'113': snail
'114': slug
'115': sea slug, nudibranch
'116': chiton, coat-of-mail shell, sea cradle, polyplacophore
'117': chambered nautilus, pearly nautilus, nautilus
'118': Dungeness crab, Cancer magister
'119': rock crab, Cancer irroratus
'120': fiddler crab
'121': king crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes
camtschatica
'122': American lobster, Northern lobster, Maine lobster, Homarus americanus
'123': spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish
'124': crayfish, crawfish, crawdad, crawdaddy
'125': hermit crab
'126': isopod
'127': white stork, Ciconia ciconia
'128': black stork, Ciconia nigra
'129': spoonbill
'130': flamingo
'131': little blue heron, Egretta caerulea
'132': American egret, great white heron, Egretta albus
'133': bittern
'134': crane
'135': limpkin, Aramus pictus
'136': European gallinule, Porphyrio porphyrio
'137': American coot, marsh hen, mud hen, water hen, Fulica americana
'138': bustard
'139': ruddy turnstone, Arenaria interpres
'140': red-backed sandpiper, dunlin, Erolia alpina
'141': redshank, Tringa totanus
'142': dowitcher
'143': oystercatcher, oyster catcher
'144': pelican
'145': king penguin, Aptenodytes patagonica
'146': albatross, mollymawk
'147': grey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius
robustus
'148': killer whale, killer, orca, grampus, sea wolf, Orcinus orca
'149': dugong, Dugong dugon
'150': sea lion
'151': Chihuahua
'152': Japanese spaniel
'153': Maltese dog, Maltese terrier, Maltese
'154': Pekinese, Pekingese, Peke
'155': Shih-Tzu
'156': Blenheim spaniel
'157': papillon
'158': toy terrier
'159': Rhodesian ridgeback
'160': Afghan hound, Afghan
'161': basset, basset hound
'162': beagle
'163': bloodhound, sleuthhound
'164': bluetick
'165': black-and-tan coonhound
'166': Walker hound, Walker foxhound
'167': English foxhound
'168': redbone
'169': borzoi, Russian wolfhound
'170': Irish wolfhound
'171': Italian greyhound
'172': whippet
'173': Ibizan hound, Ibizan Podenco
'174': Norwegian elkhound, elkhound
'175': otterhound, otter hound
'176': Saluki, gazelle hound
'177': Scottish deerhound, deerhound
'178': Weimaraner
'179': Staffordshire bullterrier, Staffordshire bull terrier
'180': American Staffordshire terrier, Staffordshire terrier, American pit
bull terrier, pit bull terrier
'181': Bedlington terrier
'182': Border terrier
'183': Kerry blue terrier
'184': Irish terrier
'185': Norfolk terrier
'186': Norwich terrier
'187': Yorkshire terrier
'188': wire-haired fox terrier
'189': Lakeland terrier
'190': Sealyham terrier, Sealyham
'191': Airedale, Airedale terrier
'192': cairn, cairn terrier
'193': Australian terrier
'194': Dandie Dinmont, Dandie Dinmont terrier
'195': Boston bull, Boston terrier
'196': miniature schnauzer
'197': giant schnauzer
'198': standard schnauzer
'199': Scotch terrier, Scottish terrier, Scottie
'200': Tibetan terrier, chrysanthemum dog
'201': silky terrier, Sydney silky
'202': soft-coated wheaten terrier
'203': West Highland white terrier
'204': Lhasa, Lhasa apso
'205': flat-coated retriever
'206': curly-coated retriever
'207': golden retriever
'208': Labrador retriever
'209': Chesapeake Bay retriever
'210': German short-haired pointer
'211': vizsla, Hungarian pointer
'212': English setter
'213': Irish setter, red setter
'214': Gordon setter
'215': Brittany spaniel
'216': clumber, clumber spaniel
'217': English springer, English springer spaniel
'218': Welsh springer spaniel
'219': cocker spaniel, English cocker spaniel, cocker
'220': Sussex spaniel
'221': Irish water spaniel
'222': kuvasz
'223': schipperke
'224': groenendael
'225': malinois
'226': briard
'227': kelpie
'228': komondor
'229': Old English sheepdog, bobtail
'230': Shetland sheepdog, Shetland sheep dog, Shetland
'231': collie
'232': Border collie
'233': Bouvier des Flandres, Bouviers des Flandres
'234': Rottweiler
'235': German shepherd, German shepherd dog, German police dog, alsatian
'236': Doberman, Doberman pinscher
'237': miniature pinscher
'238': Greater Swiss Mountain dog
'239': Bernese mountain dog
'240': Appenzeller
'241': EntleBucher
'242': boxer
'243': bull mastiff
'244': Tibetan mastiff
'245': French bulldog
'246': Great Dane
'247': Saint Bernard, St Bernard
'248': Eskimo dog, husky
'249': malamute, malemute, Alaskan malamute
'250': Siberian husky
'251': dalmatian, coach dog, carriage dog
'252': affenpinscher, monkey pinscher, monkey dog
'253': basenji
'254': pug, pug-dog
'255': Leonberg
'256': Newfoundland, Newfoundland dog
'257': Great Pyrenees
'258': Samoyed, Samoyede
'259': Pomeranian
'260': chow, chow chow
'261': keeshond
'262': Brabancon griffon
'263': Pembroke, Pembroke Welsh corgi
'264': Cardigan, Cardigan Welsh corgi
'265': toy poodle
'266': miniature poodle
'267': standard poodle
'268': Mexican hairless
'269': timber wolf, grey wolf, gray wolf, Canis lupus
'270': white wolf, Arctic wolf, Canis lupus tundrarum
'271': red wolf, maned wolf, Canis rufus, Canis niger
'272': coyote, prairie wolf, brush wolf, Canis latrans
'273': dingo, warrigal, warragal, Canis dingo
'274': dhole, Cuon alpinus
'275': African hunting dog, hyena dog, Cape hunting dog, Lycaon pictus
'276': hyena, hyaena
'277': red fox, Vulpes vulpes
'278': kit fox, Vulpes macrotis
'279': Arctic fox, white fox, Alopex lagopus
'280': grey fox, gray fox, Urocyon cinereoargenteus
'281': tabby, tabby cat
'282': tiger cat
'283': Persian cat
'284': Siamese cat, Siamese
'285': Egyptian cat
'286': cougar, puma, catamount, mountain lion, painter, panther, Felis concolor
'287': lynx, catamount
'288': leopard, Panthera pardus
'289': snow leopard, ounce, Panthera uncia
'290': jaguar, panther, Panthera onca, Felis onca
'291': lion, king of beasts, Panthera leo
'292': tiger, Panthera tigris
'293': cheetah, chetah, Acinonyx jubatus
'294': brown bear, bruin, Ursus arctos
'295': American black bear, black bear, Ursus americanus, Euarctos americanus
'296': ice bear, polar bear, Ursus Maritimus, Thalarctos maritimus
'297': sloth bear, Melursus ursinus, Ursus ursinus
'298': mongoose
'299': meerkat, mierkat
'300': tiger beetle
'301': ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle
'302': ground beetle, carabid beetle
'303': long-horned beetle, longicorn, longicorn beetle
'304': leaf beetle, chrysomelid
'305': dung beetle
'306': rhinoceros beetle
'307': weevil
'308': fly
'309': bee
'310': ant, emmet, pismire
'311': grasshopper, hopper
'312': cricket
'313': walking stick, walkingstick, stick insect
'314': cockroach, roach
'315': mantis, mantid
'316': cicada, cicala
'317': leafhopper
'318': lacewing, lacewing fly
'319': dragonfly, darning needle, devil's darning needle, sewing needle,
snake feeder, snake doctor, mosquito hawk, skeeter hawk
'320': damselfly
'321': admiral
'322': ringlet, ringlet butterfly
'323': monarch, monarch butterfly, milkweed butterfly, Danaus plexippus
'324': cabbage butterfly
'325': sulphur butterfly, sulfur butterfly
'326': lycaenid, lycaenid butterfly
'327': starfish, sea star
'328': sea urchin
'329': sea cucumber, holothurian
'330': wood rabbit, cottontail, cottontail rabbit
'331': hare
'332': Angora, Angora rabbit
'333': hamster
'334': porcupine, hedgehog
'335': fox squirrel, eastern fox squirrel, Sciurus niger
'336': marmot
'337': beaver
'338': guinea pig, Cavia cobaya
'339': sorrel
'340': zebra
'341': hog, pig, grunter, squealer, Sus scrofa
'342': wild boar, boar, Sus scrofa
'343': warthog
'344': hippopotamus, hippo, river horse, Hippopotamus amphibius
'345': ox
'346': water buffalo, water ox, Asiatic buffalo, Bubalus bubalis
'347': bison
'348': ram, tup
'349': bighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain
sheep, Ovis canadensis
'350': ibex, Capra ibex
'351': hartebeest
'352': impala, Aepyceros melampus
'353': gazelle
'354': Arabian camel, dromedary, Camelus dromedarius
'355': llama
'356': weasel
'357': mink
'358': polecat, fitch, foulmart, foumart, Mustela putorius
'359': black-footed ferret, ferret, Mustela nigripes
'360': otter
'361': skunk, polecat, wood pussy
'362': badger
'363': armadillo
'364': three-toed sloth, ai, Bradypus tridactylus
'365': orangutan, orang, orangutang, Pongo pygmaeus
'366': gorilla, Gorilla gorilla
'367': chimpanzee, chimp, Pan troglodytes
'368': gibbon, Hylobates lar
'369': siamang, Hylobates syndactylus, Symphalangus syndactylus
'370': guenon, guenon monkey
'371': patas, hussar monkey, Erythrocebus patas
'372': baboon
'373': macaque
'374': langur
'375': colobus, colobus monkey
'376': proboscis monkey, Nasalis larvatus
'377': marmoset
'378': capuchin, ringtail, Cebus capucinus
'379': howler monkey, howler
'380': titi, titi monkey
'381': spider monkey, Ateles geoffroyi
'382': squirrel monkey, Saimiri sciureus
'383': Madagascar cat, ring-tailed lemur, Lemur catta
'384': indri, indris, Indri indri, Indri brevicaudatus
'385': Indian elephant, Elephas maximus
'386': African elephant, Loxodonta africana
'387': lesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens
'388': giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca
'389': barracouta, snoek
'390': eel
'391': coho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus
kisutch
'392': rock beauty, Holocanthus tricolor
'393': anemone fish
'394': sturgeon
'395': gar, garfish, garpike, billfish, Lepisosteus osseus
'396': lionfish
'397': puffer, pufferfish, blowfish, globefish
'398': abacus
'399': abaya
'400': academic gown, academic robe, judge's robe
'401': accordion, piano accordion, squeeze box
'402': acoustic guitar
'403': aircraft carrier, carrier, flattop, attack aircraft carrier
'404': airliner
'405': airship, dirigible
'406': altar
'407': ambulance
'408': amphibian, amphibious vehicle
'409': analog clock
'410': apiary, bee house
'411': apron
'412': ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin,
dustbin, trash barrel, trash bin
'413': assault rifle, assault gun
'414': backpack, back pack, knapsack, packsack, rucksack, haversack
'415': bakery, bakeshop, bakehouse
'416': balance beam, beam
'417': balloon
'418': ballpoint, ballpoint pen, ballpen, Biro
'419': Band Aid
'420': banjo
'421': bannister, banister, balustrade, balusters, handrail
'422': barbell
'423': barber chair
'424': barbershop
'425': barn
'426': barometer
'427': barrel, cask
'428': barrow, garden cart, lawn cart, wheelbarrow
'429': baseball
'430': basketball
'431': bassinet
'432': bassoon
'433': bathing cap, swimming cap
'434': bath towel
'435': bathtub, bathing tub, bath, tub
'436': beach wagon, station wagon, wagon, estate car, beach waggon, station
waggon, waggon
'437': beacon, lighthouse, beacon light, pharos
'438': beaker
'439': bearskin, busby, shako
'440': beer bottle
'441': beer glass
'442': bell cote, bell cot
'443': bib
'444': bicycle-built-for-two, tandem bicycle, tandem
'445': bikini, two-piece
'446': binder, ring-binder
'447': binoculars, field glasses, opera glasses
'448': birdhouse
'449': boathouse
'450': bobsled, bobsleigh, bob
'451': bolo tie, bolo, bola tie, bola
'452': bonnet, poke bonnet
'453': bookcase
'454': bookshop, bookstore, bookstall
'455': bottlecap
'456': bow
'457': bow tie, bow-tie, bowtie
'458': brass, memorial tablet, plaque
'459': brassiere, bra, bandeau
'460': breakwater, groin, groyne, mole, bulwark, seawall, jetty
'461': breastplate, aegis, egis
'462': broom
'463': bucket, pail
'464': buckle
'465': bulletproof vest
'466': bullet train, bullet
'467': butcher shop, meat market
'468': cab, hack, taxi, taxicab
'469': caldron, cauldron
'470': candle, taper, wax light
'471': cannon
'472': canoe
'473': can opener, tin opener
'474': cardigan
'475': car mirror
'476': carousel, carrousel, merry-go-round, roundabout, whirligig
'477': carpenter's kit, tool kit
'478': carton
'479': car wheel
'480': cash machine, cash dispenser, automated teller machine, automatic
teller machine, automated teller, automatic teller, ATM
'481': cassette
'482': cassette player
'483': castle
'484': catamaran
'485': CD player
'486': cello, violoncello
'487': cellular telephone, cellular phone, cellphone, cell, mobile phone
'488': chain
'489': chainlink fence
'490': chain mail, ring mail, mail, chain armor, chain armour, ring armor,
ring armour
'491': chain saw, chainsaw
'492': chest
'493': chiffonier, commode
'494': chime, bell, gong
'495': china cabinet, china closet
'496': Christmas stocking
'497': church, church building
'498': cinema, movie theater, movie theatre, movie house, picture palace
'499': cleaver, meat cleaver, chopper
'500': cliff dwelling
'501': cloak
'502': clog, geta, patten, sabot
'503': cocktail shaker
'504': coffee mug
'505': coffeepot
'506': coil, spiral, volute, whorl, helix
'507': combination lock
'508': computer keyboard, keypad
'509': confectionery, confectionary, candy store
'510': container ship, containership, container vessel
'511': convertible
'512': corkscrew, bottle screw
'513': cornet, horn, trumpet, trump
'514': cowboy boot
'515': cowboy hat, ten-gallon hat
'516': cradle
'517': crane2
'518': crash helmet
'519': crate
'520': crib, cot
'521': Crock Pot
'522': croquet ball
'523': crutch
'524': cuirass
'525': dam, dike, dyke
'526': desk
'527': desktop computer
'528': dial telephone, dial phone
'529': diaper, nappy, napkin
'530': digital clock
'531': digital watch
'532': dining table, board
'533': dishrag, dishcloth
'534': dishwasher, dish washer, dishwashing machine
'535': disk brake, disc brake
'536': dock, dockage, docking facility
'537': dogsled, dog sled, dog sleigh
'538': dome
'539': doormat, welcome mat
'540': drilling platform, offshore rig
'541': drum, membranophone, tympan
'542': drumstick
'543': dumbbell
'544': Dutch oven
'545': electric fan, blower
'546': electric guitar
'547': electric locomotive
'548': entertainment center
'549': envelope
'550': espresso maker
'551': face powder
'552': feather boa, boa
'553': file, file cabinet, filing cabinet
'554': fireboat
'555': fire engine, fire truck
'556': fire screen, fireguard
'557': flagpole, flagstaff
'558': flute, transverse flute
'559': folding chair
'560': football helmet
'561': forklift
'562': fountain
'563': fountain pen
'564': four-poster
'565': freight car
'566': French horn, horn
'567': frying pan, frypan, skillet
'568': fur coat
'569': garbage truck, dustcart
'570': gasmask, respirator, gas helmet
'571': gas pump, gasoline pump, petrol pump, island dispenser
'572': goblet
'573': go-kart
'574': golf ball
'575': golfcart, golf cart
'576': gondola
'577': gong, tam-tam
'578': gown
'579': grand piano, grand
'580': greenhouse, nursery, glasshouse
'581': grille, radiator grille
'582': grocery store, grocery, food market, market
'583': guillotine
'584': hair slide
'585': hair spray
'586': half track
'587': hammer
'588': hamper
'589': hand blower, blow dryer, blow drier, hair dryer, hair drier
'590': hand-held computer, hand-held microcomputer
'591': handkerchief, hankie, hanky, hankey
'592': hard disc, hard disk, fixed disk
'593': harmonica, mouth organ, harp, mouth harp
'594': harp
'595': harvester, reaper
'596': hatchet
'597': holster
'598': home theater, home theatre
'599': honeycomb
'600': hook, claw
'601': hoopskirt, crinoline
'602': horizontal bar, high bar
'603': horse cart, horse-cart
'604': hourglass
'605': iPod
'606': iron, smoothing iron
'607': jack-o'-lantern
'608': jean, blue jean, denim
'609': jeep, landrover
'610': jersey, T-shirt, tee shirt
'611': jigsaw puzzle
'612': jinrikisha, ricksha, rickshaw
'613': joystick
'614': kimono
'615': knee pad
'616': knot
'617': lab coat, laboratory coat
'618': ladle
'619': lampshade, lamp shade
'620': laptop, laptop computer
'621': lawn mower, mower
'622': lens cap, lens cover
'623': letter opener, paper knife, paperknife
'624': library
'625': lifeboat
'626': lighter, light, igniter, ignitor
'627': limousine, limo
'628': liner, ocean liner
'629': lipstick, lip rouge
'630': Loafer
'631': lotion
'632': loudspeaker, speaker, speaker unit, loudspeaker system, speaker system
'633': loupe, jeweler's loupe
'634': lumbermill, sawmill
'635': magnetic compass
'636': mailbag, postbag
'637': mailbox, letter box
'638': maillot
'639': maillot, tank suit
'640': manhole cover
'641': maraca
'642': marimba, xylophone
'643': mask
'644': matchstick
'645': maypole
'646': maze, labyrinth
'647': measuring cup
'648': medicine chest, medicine cabinet
'649': megalith, megalithic structure
'650': microphone, mike
'651': microwave, microwave oven
'652': military uniform
'653': milk can
'654': minibus
'655': miniskirt, mini
'656': minivan
'657': missile
'658': mitten
'659': mixing bowl
'660': mobile home, manufactured home
'661': Model T
'662': modem
'663': monastery
'664': monitor
'665': moped
'666': mortar
'667': mortarboard
'668': mosque
'669': mosquito net
'670': motor scooter, scooter
'671': mountain bike, all-terrain bike, off-roader
'672': mountain tent
'673': mouse, computer mouse
'674': mousetrap
'675': moving van
'676': muzzle
'677': nail
'678': neck brace
'679': necklace
'680': nipple
'681': notebook, notebook computer
'682': obelisk
'683': oboe, hautboy, hautbois
'684': ocarina, sweet potato
'685': odometer, hodometer, mileometer, milometer
'686': oil filter
'687': organ, pipe organ
'688': oscilloscope, scope, cathode-ray oscilloscope, CRO
'689': overskirt
'690': oxcart
'691': oxygen mask
'692': packet
'693': paddle, boat paddle
'694': paddlewheel, paddle wheel
'695': padlock
'696': paintbrush
'697': pajama, pyjama, pj's, jammies
'698': palace
'699': panpipe, pandean pipe, syrinx
'700': paper towel
'701': parachute, chute
'702': parallel bars, bars
'703': park bench
'704': parking meter
'705': passenger car, coach, carriage
'706': patio, terrace
'707': pay-phone, pay-station
'708': pedestal, plinth, footstall
'709': pencil box, pencil case
'710': pencil sharpener
'711': perfume, essence
'712': Petri dish
'713': photocopier
'714': pick, plectrum, plectron
'715': pickelhaube
'716': picket fence, paling
'717': pickup, pickup truck
'718': pier
'719': piggy bank, penny bank
'720': pill bottle
'721': pillow
'722': ping-pong ball
'723': pinwheel
'724': pirate, pirate ship
'725': pitcher, ewer
'726': plane, carpenter's plane, woodworking plane
'727': planetarium
'728': plastic bag
'729': plate rack
'730': plow, plough
'731': plunger, plumber's helper
'732': Polaroid camera, Polaroid Land camera
'733': pole
'734': police van, police wagon, paddy wagon, patrol wagon, wagon, black
Maria
'735': poncho
'736': pool table, billiard table, snooker table
'737': pop bottle, soda bottle
'738': pot, flowerpot
'739': potter's wheel
'740': power drill
'741': prayer rug, prayer mat
'742': printer
'743': prison, prison house
'744': projectile, missile
'745': projector
'746': puck, hockey puck
'747': punching bag, punch bag, punching ball, punchball
'748': purse
'749': quill, quill pen
'750': quilt, comforter, comfort, puff
'751': racer, race car, racing car
'752': racket, racquet
'753': radiator
'754': radio, wireless
'755': radio telescope, radio reflector
'756': rain barrel
'757': recreational vehicle, RV, R.V.
'758': reel
'759': reflex camera
'760': refrigerator, icebox
'761': remote control, remote
'762': restaurant, eating house, eating place, eatery
'763': revolver, six-gun, six-shooter
'764': rifle
'765': rocking chair, rocker
'766': rotisserie
'767': rubber eraser, rubber, pencil eraser
'768': rugby ball
'769': rule, ruler
'770': running shoe
'771': safe
'772': safety pin
'773': saltshaker, salt shaker
'774': sandal
'775': sarong
'776': sax, saxophone
'777': scabbard
'778': scale, weighing machine
'779': school bus
'780': schooner
'781': scoreboard
'782': screen, CRT screen
'783': screw
'784': screwdriver
'785': seat belt, seatbelt
'786': sewing machine
'787': shield, buckler
'788': shoe shop, shoe-shop, shoe store
'789': shoji
'790': shopping basket
'791': shopping cart
'792': shovel
'793': shower cap
'794': shower curtain
'795': ski
'796': ski mask
'797': sleeping bag
'798': slide rule, slipstick
'799': sliding door
'800': slot, one-armed bandit
'801': snorkel
'802': snowmobile
'803': snowplow, snowplough
'804': soap dispenser
'805': soccer ball
'806': sock
'807': solar dish, solar collector, solar furnace
'808': sombrero
'809': soup bowl
'810': space bar
'811': space heater
'812': space shuttle
'813': spatula
'814': speedboat
'815': spider web, spider's web
'816': spindle
'817': sports car, sport car
'818': spotlight, spot
'819': stage
'820': steam locomotive
'821': steel arch bridge
'822': steel drum
'823': stethoscope
'824': stole
'825': stone wall
'826': stopwatch, stop watch
'827': stove
'828': strainer
'829': streetcar, tram, tramcar, trolley, trolley car
'830': stretcher
'831': studio couch, day bed
'832': stupa, tope
'833': submarine, pigboat, sub, U-boat
'834': suit, suit of clothes
'835': sundial
'836': sunglass
'837': sunglasses, dark glasses, shades
'838': sunscreen, sunblock, sun blocker
'839': suspension bridge
'840': swab, swob, mop
'841': sweatshirt
'842': swimming trunks, bathing trunks
'843': swing
'844': switch, electric switch, electrical switch
'845': syringe
'846': table lamp
'847': tank, army tank, armored combat vehicle, armoured combat vehicle
'848': tape player
'849': teapot
'850': teddy, teddy bear
'851': television, television system
'852': tennis ball
'853': thatch, thatched roof
'854': theater curtain, theatre curtain
'855': thimble
'856': thresher, thrasher, threshing machine
'857': throne
'858': tile roof
'859': toaster
'860': tobacco shop, tobacconist shop, tobacconist
'861': toilet seat
'862': torch
'863': totem pole
'864': tow truck, tow car, wrecker
'865': toyshop
'866': tractor
'867': trailer truck, tractor trailer, trucking rig, rig, articulated lorry,
semi
'868': tray
'869': trench coat
'870': tricycle, trike, velocipede
'871': trimaran
'872': tripod
'873': triumphal arch
'874': trolleybus, trolley coach, trackless trolley
'875': trombone
'876': tub, vat
'877': turnstile
'878': typewriter keyboard
'879': umbrella
'880': unicycle, monocycle
'881': upright, upright piano
'882': vacuum, vacuum cleaner
'883': vase
'884': vault
'885': velvet
'886': vending machine
'887': vestment
'888': viaduct
'889': violin, fiddle
'890': volleyball
'891': waffle iron
'892': wall clock
'893': wallet, billfold, notecase, pocketbook
'894': wardrobe, closet, press
'895': warplane, military plane
'896': washbasin, handbasin, washbowl, lavabo, wash-hand basin
'897': washer, automatic washer, washing machine
'898': water bottle
'899': water jug
'900': water tower
'901': whiskey jug
'902': whistle
'903': wig
'904': window screen
'905': window shade
'906': Windsor tie
'907': wine bottle
'908': wing
'909': wok
'910': wooden spoon
'911': wool, woolen, woollen
'912': worm fence, snake fence, snake-rail fence, Virginia fence
'913': wreck
'914': yawl
'915': yurt
'916': web site, website, internet site, site
'917': comic book
'918': crossword puzzle, crossword
'919': street sign
'920': traffic light, traffic signal, stoplight
'921': book jacket, dust cover, dust jacket, dust wrapper
'922': menu
'923': plate
'924': guacamole
'925': consomme
'926': hot pot, hotpot
'927': trifle
'928': ice cream, icecream
'929': ice lolly, lolly, lollipop, popsicle
'930': French loaf
'931': bagel, beigel
'932': pretzel
'933': cheeseburger
'934': hotdog, hot dog, red hot
'935': mashed potato
'936': head cabbage
'937': broccoli
'938': cauliflower
'939': zucchini, courgette
'940': spaghetti squash
'941': acorn squash
'942': butternut squash
'943': cucumber, cuke
'944': artichoke, globe artichoke
'945': bell pepper
'946': cardoon
'947': mushroom
'948': Granny Smith
'949': strawberry
'950': orange
'951': lemon
'952': fig
'953': pineapple, ananas
'954': banana
'955': jackfruit, jak, jack
'956': custard apple
'957': pomegranate
'958': hay
'959': carbonara
'960': chocolate sauce, chocolate syrup
'961': dough
'962': meat loaf, meatloaf
'963': pizza, pizza pie
'964': potpie
'965': burrito
'966': red wine
'967': espresso
'968': cup
'969': eggnog
'970': alp
'971': bubble
'972': cliff, drop, drop-off
'973': coral reef
'974': geyser
'975': lakeside, lakeshore
'976': promontory, headland, head, foreland
'977': sandbar, sand bar
'978': seashore, coast, seacoast, sea-coast
'979': valley, vale
'980': volcano
'981': ballplayer, baseball player
'982': groom, bridegroom
'983': scuba diver
'984': rapeseed
'985': daisy
'986': yellow lady's slipper, yellow lady-slipper, Cypripedium calceolus,
Cypripedium parviflorum
'987': corn
'988': acorn
'989': hip, rose hip, rosehip
'990': buckeye, horse chestnut, conker
'991': coral fungus
'992': agaric
'993': gyromitra
'994': stinkhorn, carrion fungus
'995': earthstar
'996': hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola
frondosa
'997': bolete
'998': ear, spike, capitulum
'999': toilet tissue, toilet paper, bathroom tissue
splits:
- name: train
num_bytes: 43316148.77032365
num_examples: 255
download_size: 42640144
dataset_size: 43316148.77032365
---
|
wmt/wmt19 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- cs
- de
- en
- fi
- fr
- gu
- kk
- lt
- ru
- zh
license:
- unknown
multilinguality:
- translation
size_categories:
- 10M<n<100M
source_datasets:
- extended|europarl_bilingual
- extended|news_commentary
- extended|opus_paracrawl
- extended|un_multi
task_categories:
- translation
task_ids: []
pretty_name: WMT19
dataset_info:
- config_name: cs-en
features:
- name: translation
dtype:
translation:
languages:
- cs
- en
splits:
- name: train
num_bytes: 1314866170
num_examples: 7270695
- name: validation
num_bytes: 696221
num_examples: 2983
download_size: 665590448
dataset_size: 1315562391
- config_name: de-en
features:
- name: translation
dtype:
translation:
languages:
- de
- en
splits:
- name: train
num_bytes: 7645655677
num_examples: 34782245
- name: validation
num_bytes: 757641
num_examples: 2998
download_size: 4079732256
dataset_size: 7646413318
- config_name: fi-en
features:
- name: translation
dtype:
translation:
languages:
- fi
- en
splits:
- name: train
num_bytes: 1422916995
num_examples: 6587448
- name: validation
num_bytes: 691833
num_examples: 3000
download_size: 739629820
dataset_size: 1423608828
- config_name: fr-de
features:
- name: translation
dtype:
translation:
languages:
- fr
- de
splits:
- name: train
num_bytes: 2358405621
num_examples: 9824476
- name: validation
num_bytes: 441418
num_examples: 1512
download_size: 1261830726
dataset_size: 2358847039
- config_name: gu-en
features:
- name: translation
dtype:
translation:
languages:
- gu
- en
splits:
- name: train
num_bytes: 590747
num_examples: 11670
- name: validation
num_bytes: 774613
num_examples: 1998
download_size: 730223
dataset_size: 1365360
- config_name: kk-en
features:
- name: translation
dtype:
translation:
languages:
- kk
- en
splits:
- name: train
num_bytes: 9157334
num_examples: 126583
- name: validation
num_bytes: 846849
num_examples: 2066
download_size: 5759291
dataset_size: 10004183
- config_name: lt-en
features:
- name: translation
dtype:
translation:
languages:
- lt
- en
splits:
- name: train
num_bytes: 513082481
num_examples: 2344893
- name: validation
num_bytes: 541945
num_examples: 2000
download_size: 284890393
dataset_size: 513624426
- config_name: ru-en
features:
- name: translation
dtype:
translation:
languages:
- ru
- en
splits:
- name: train
num_bytes: 13721347178
num_examples: 37492126
- name: validation
num_bytes: 1085588
num_examples: 3000
download_size: 6167016481
dataset_size: 13722432766
- config_name: zh-en
features:
- name: translation
dtype:
translation:
languages:
- zh
- en
splits:
- name: train
num_bytes: 6391177013
num_examples: 25984574
- name: validation
num_bytes: 1107514
num_examples: 3981
download_size: 3615575187
dataset_size: 6392284527
configs:
- config_name: cs-en
data_files:
- split: train
path: cs-en/train-*
- split: validation
path: cs-en/validation-*
- config_name: de-en
data_files:
- split: train
path: de-en/train-*
- split: validation
path: de-en/validation-*
- config_name: fi-en
data_files:
- split: train
path: fi-en/train-*
- split: validation
path: fi-en/validation-*
- config_name: fr-de
data_files:
- split: train
path: fr-de/train-*
- split: validation
path: fr-de/validation-*
- config_name: gu-en
data_files:
- split: train
path: gu-en/train-*
- split: validation
path: gu-en/validation-*
- config_name: kk-en
data_files:
- split: train
path: kk-en/train-*
- split: validation
path: kk-en/validation-*
- config_name: lt-en
data_files:
- split: train
path: lt-en/train-*
- split: validation
path: lt-en/validation-*
- config_name: ru-en
data_files:
- split: train
path: ru-en/train-*
- split: validation
path: ru-en/validation-*
- config_name: zh-en
data_files:
- split: train
path: zh-en/train-*
- split: validation
path: zh-en/validation-*
---
# Dataset Card for "wmt19"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [http://www.statmt.org/wmt19/translation-task.html](http://www.statmt.org/wmt19/translation-task.html)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 2.02 GB
- **Size of the generated dataset:** 1.32 GB
- **Total amount of disk used:** 3.33 GB
### Dataset Summary
<div class="course-tip course-tip-orange bg-gradient-to-br dark:bg-gradient-to-r before:border-orange-500 dark:before:border-orange-800 from-orange-50 dark:from-gray-900 to-white dark:to-gray-950 border border-orange-50 text-orange-700 dark:text-gray-400">
<p><b>Warning:</b> There are issues with the Common Crawl corpus data (<a href="https://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz">training-parallel-commoncrawl.tgz</a>):</p>
<ul>
<li>Non-English files contain many English sentences.</li>
<li>The "parallel" English sentences are not aligned with their non-English counterparts: the two sides are uncorrelated.</li>
</ul>
<p>We have contacted the WMT organizers, who indicated that they do not plan to update the Common Crawl corpus data, as they consider it largely superseded by CCMatrix and, to some extent, by the ParaCrawl datasets.</p>
</div>
Translation dataset based on the data from statmt.org.
Versions exist for different years, each combining several data
sources. The base `wmt` builder allows you to create a custom dataset by choosing
your own data sources and language pair. This can be done as follows:
```python
import datasets  # needed for datasets.Split below
from datasets import inspect_dataset, load_dataset_builder
inspect_dataset("wmt19", "path/to/scripts")
builder = load_dataset_builder(
"path/to/scripts/wmt_utils.py",
language_pair=("fr", "de"),
subsets={
datasets.Split.TRAIN: ["commoncrawl_frde"],
datasets.Split.VALIDATION: ["euelections_dev2019"],
},
)
# Standard version
builder.download_and_prepare()
ds = builder.as_dataset()
# Streamable version
ds = builder.as_streaming_dataset()
```
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### cs-en
- **Size of downloaded dataset files:** 2.02 GB
- **Size of the generated dataset:** 1.32 GB
- **Total amount of disk used:** 3.33 GB
An example of 'validation' looks as follows.
```
```
### Data Fields
The data fields are the same among all splits.
#### cs-en
- `translation`: a multilingual `string` variable, with possible languages including `cs`, `en`.
### Data Splits
|name | train |validation|
|-----|------:|---------:|
|cs-en|7270695| 2983|
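Each example in the splits above holds a single `translation` dictionary keyed by language code. A minimal sketch of that structure and a typical seq2seq preprocessing step (the sentence values here are illustrative, not real WMT19 data):

```python
# One cs-en example as produced by this dataset's `translation` feature
# (sentence values are made up for illustration, not taken from WMT19)
example = {"translation": {"cs": "Ahoj, světe!", "en": "Hello, world!"}}

# Both language codes are always present in a cs-en example
assert set(example["translation"]) == {"cs", "en"}

# Pair the two sides as source/target for a translation model
source = example["translation"]["cs"]
target = example["translation"]["en"]
print(f"{source} -> {target}")
```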
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@ONLINE {wmt19translate,
author = "Wikimedia Foundation",
title = "ACL 2019 Fourth Conference on Machine Translation (WMT19), Shared Task: Machine Translation of News",
url = "http://www.statmt.org/wmt19/translation-task.html"
}
```
### Contributions
Thanks to [@patrickvonplaten](https://github.com/patrickvonplaten), [@mariamabarham](https://github.com/mariamabarham), [@thomwolf](https://github.com/thomwolf) for adding this dataset. |
jordonpeter01/fuego-20230902-041357-9d81ce | ---
tags:
- fuego
fuego:
id: 20230902-041357-9d81ce
status: done
script: run_glue.py
requirements_file: requirements.txt
space_id: jordonpeter01/fuego-20230902-041357-9d81ce
space_hardware: cpu-basic
github_repo_id: huggingface/transformers
github_repo_branch: main
github_repo_sha: 0afa5071bd84e44301750fdc594e33db102cf374
---
|
UnderstandLing/oasst1_de | ---
license: apache-2.0
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: float64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: value
sequence: float64
splits:
- name: train
num_bytes: 89117803
num_examples: 81167
- name: validation
num_bytes: 3382088
num_examples: 3001
download_size: 31597623
dataset_size: 92499891
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
anan-2024/twitter_dataset_1713193509 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 20341
num_examples: 48
download_size: 13253
dataset_size: 20341
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
suneeln-duke/duke-qa-pairs | ---
dataset_info:
features:
- name: questions
dtype: string
- name: answers
dtype: string
splits:
- name: train
num_bytes: 74400906
num_examples: 87867
- name: valid
num_bytes: 9297440
num_examples: 10637
download_size: 19068445
dataset_size: 83698346
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
---
|
open-llm-leaderboard/details_TheBloke__landmark-attention-llama7b-fp16 | ---
pretty_name: Evaluation run of TheBloke/landmark-attention-llama7b-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/landmark-attention-llama7b-fp16](https://huggingface.co/TheBloke/landmark-attention-llama7b-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__landmark-attention-llama7b-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T21:06:08.838189](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__landmark-attention-llama7b-fp16/blob/main/results_2023-10-22T21-06-08.838189.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.0003921042190298539,\n \"f1\": 0.04697252516778534,\n\
\ \"f1_stderr\": 0.0013361369387872978,\n \"acc\": 0.34813421471026634,\n\
\ \"acc_stderr\": 0.008277173895027065\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298539,\n\
\ \"f1\": 0.04697252516778534,\n \"f1_stderr\": 0.0013361369387872978\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \
\ \"acc_stderr\": 0.0034478192723890015\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6803472770323599,\n \"acc_stderr\": 0.01310652851766513\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/landmark-attention-llama7b-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|arc:challenge|25_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T21_06_08.838189
path:
- '**/details_harness|drop|3_2023-10-22T21-06-08.838189.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T21-06-08.838189.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T21_06_08.838189
path:
- '**/details_harness|gsm8k|5_2023-10-22T21-06-08.838189.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T21-06-08.838189.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hellaswag|10_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T15:07:15.770295.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T15:07:15.770295.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T15:07:15.770295.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T21_06_08.838189
path:
- '**/details_harness|winogrande|5_2023-10-22T21-06-08.838189.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T21-06-08.838189.parquet'
- config_name: results
data_files:
- split: 2023_07_31T15_07_15.770295
path:
- results_2023-07-31T15:07:15.770295.parquet
- split: 2023_10_22T21_06_08.838189
path:
- results_2023-10-22T21-06-08.838189.parquet
- split: latest
path:
- results_2023-10-22T21-06-08.838189.parquet
---
# Dataset Card for Evaluation run of TheBloke/landmark-attention-llama7b-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/landmark-attention-llama7b-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/landmark-attention-llama7b-fp16](https://huggingface.co/TheBloke/landmark-attention-llama7b-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__landmark-attention-llama7b-fp16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T21:06:08.838189](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__landmark-attention-llama7b-fp16/blob/main/results_2023-10-22T21-06-08.838189.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298539,
"f1": 0.04697252516778534,
"f1_stderr": 0.0013361369387872978,
"acc": 0.34813421471026634,
"acc_stderr": 0.008277173895027065
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298539,
"f1": 0.04697252516778534,
"f1_stderr": 0.0013361369387872978
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.0034478192723890015
},
"harness|winogrande|5": {
"acc": 0.6803472770323599,
"acc_stderr": 0.01310652851766513
}
}
```
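As a minimal sketch of how these per-task scores can be worked with, the snippet below flattens the nested results structure shown above into a per-task mapping for one metric. The dict literal copies values from the results JSON; the `per_task_metric` helper is illustrative, not part of the leaderboard tooling.

```python
# Results structure copied from the "latest results" JSON above.
results = {
    "all": {"em": 0.0014681208053691276, "f1": 0.04697252516778534, "acc": 0.34813421471026634},
    "harness|drop|3": {"em": 0.0014681208053691276, "f1": 0.04697252516778534},
    "harness|gsm8k|5": {"acc": 0.01592115238817286},
    "harness|winogrande|5": {"acc": 0.6803472770323599},
}

def per_task_metric(results, metric):
    """Collect one metric across all task entries, skipping the 'all' aggregate."""
    out = {}
    for key, scores in results.items():
        if key == "all" or metric not in scores:
            continue
        task = key.split("|")[1]  # "harness|winogrande|5" -> "winogrande"
        out[task] = scores[metric]
    return out

accs = per_task_metric(results, "acc")
```

Tasks that do not report the requested metric (here, `drop` for `acc`) are simply omitted from the result.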
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Mitsuki-Sakamoto/alpaca_farm-reward-model-deberta-v3-large-v2-re-preference-64-nsample-16_random | ---
dataset_info:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
splits:
- name: preference
num_bytes: 25791877
num_examples: 20001
download_size: 12310829
dataset_size: 25791877
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
splits:
- name: preference
num_bytes: 25837484
num_examples: 20001
download_size: 12262392
dataset_size: 25837484
- config_name: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
splits:
- name: preference
num_bytes: 25779381
num_examples: 20001
download_size: 11985077
dataset_size: 25779381
configs:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
data_files:
- split: preference
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/preference-*
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: preference
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/preference-*
- config_name: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: preference
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/preference-*
---
|
AdapterOcean/code_instructions_standardized_cluster_7 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 91956578
num_examples: 8650
download_size: 28080402
dataset_size: 91956578
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
devingulliver/dolma-v1_6-sample | ---
dataset_info:
features:
- name: provenance
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 34959777487
num_examples: 13095416
download_size: 20602021674
dataset_size: 34959777487
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HackerNoon/tech-company-news-data-dump | ---
license: mit
task_categories:
- text-classification
- summarization
language:
- en
size_categories:
- 1M<n<10M
tags:
- news
- technology news
- company news
- tech company news
- tech news
- technology company news
- tech company blogs
- technology company blogs
- hackernoon
- hacker noon
- news curation
- tech news curation
- tech company news curation
- technology company news curation
- tech blog curation
- technology blog curation
- brave search api
- bing news api
- hackernoon api
- hacker noon api
- tech company news api
- technology company news api
---
[HackerNoon](https://hackernoon.com) curated the internet's most cited 7M+ tech company news articles and blog posts about the 3k+ most valuable tech companies in 2022 and 2023. These stories were curated to power [HackerNoon.com/Companies](https://hackernoon.com/companies), where we update daily news on top technology companies like [Microsoft](https://hackernoon.com/company/microsoft), [Google](https://hackernoon.com/company/google), and [HuggingFace](https://hackernoon.com/company/huggingface). Please use this news data freely for your project, and as always anyone is welcome to [publish on HackerNoon](https://hackernoon.com/p/publish). |
aai530-group6/ddxplus-french | ---
language:
- fr
license: cc-by-4.0
license_link: https://creativecommons.org/licenses/by/4.0/
tags:
- automatic-diagnosis
- automatic-symptom-detection
- differential-diagnosis
- synthetic-patients
- diseases
- health-care
pretty_name: DDXPlus
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- tabular-classification
task_ids:
- multi-class-classification
paperswithcode_id: ddxplus
configs:
- config_name: default
data_files:
- split: train
path: "train.csv"
- split: test
path: "test.csv"
- split: validate
path: "validate.csv"
extra_gated_prompt: "By accessing this dataset, you agree to use it solely for research purposes and not for clinical decision-making."
extra_gated_fields:
Consent: checkbox
Purpose of use:
type: select
options:
- Research
- Educational
- label: Other
value: other
train-eval-index:
- config: default
task: medical-diagnosis
task_id: binary-classification
splits:
train_split: train
eval_split: validate
col_mapping:
AGE: AGE
SEX: SEX
PATHOLOGY: PATHOLOGY
EVIDENCES: EVIDENCES
INITIAL_EVIDENCE: INITIAL_EVIDENCE
DIFFERENTIAL_DIAGNOSIS: DIFFERENTIAL_DIAGNOSIS
metrics:
- type: accuracy
name: Accuracy
- type: f1
name: F1 Score
---
# Dataset Description
We are releasing under the CC-BY licence a new large-scale dataset for Automatic Symptom Detection (ASD) and Automatic Diagnosis (AD) systems in the medical domain. The dataset contains patients synthesized using a proprietary medical knowledge base and a commercial rule-based AD system. Patients in the dataset are characterized by their socio-demographic data, a pathology they are suffering from, a set of symptoms and antecedents related to this pathology, and a differential diagnosis. The symptoms and antecedents can be binary, categorical and multi-choice, with the potential of leading to more efficient and natural interactions between ASD/AD systems and patients. To the best of our knowledge, this is the first large-scale dataset that includes the differential diagnosis, and non-binary symptoms and antecedents.
**Note**: We use evidence as a general term to refer to a symptom or an antecedent.
This directory contains the following files:
- **release_evidences.json**: a JSON file describing all possible evidences considered in the dataset.
- **release_conditions.json**: a JSON file describing all pathologies considered in the dataset.
- **release_train_patients.zip**: a CSV file containing the patients of the training set.
- **release_validate_patients.zip**: a CSV file containing the patients of the validation set.
- **release_test_patients.zip**: a CSV file containing the patients of the test set.
## Evidence Description
Each evidence in the `release_evidences.json` file is described using the following entries:
- **name**: name of the evidence.
- **code_question**: a code used to identify related evidences. Evidences having the same `code_question` form a group of related symptoms. The value of `code_question` refers to the evidence that needs to be simulated/activated for the other members of the group to eventually be simulated.
- **question_fr**: the query, in French, associated to the evidence.
- **question_en**: the query, in English, associated to the evidence.
- **is_antecedent**: a flag indicating whether the evidence is an antecedent or a symptom.
- **data_type**: the type of evidence. We use `B` for binary, `C` for categorical, and `M` for multi-choice evidences.
- **default_value**: the default value of the evidence. If this value is used to characterize the evidence, then it is as if the evidence was not synthesized.
- **possible-values**: the possible values for the evidences. Only valid for categorical and multi-choice evidences.
- **value_meaning**: The meaning, in French and English, of each code that is part of the `possible-values` field. Only valid for categorical and multi-choice evidences.
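The schema above can be sketched in code. The entry below is a hypothetical binary evidence whose field names follow the card's description; the values are illustrative and not taken from `release_evidences.json` itself (which would be loaded with `json.load`).

```python
import json  # release_evidences.json would be loaded with json.load(open(...))

# Hypothetical evidence entry mirroring the described schema; "B" marks a
# binary evidence, and the values here are illustrative only.
evidence = {
    "name": "fever",
    "code_question": "fever",
    "question_en": "Do you have a fever?",
    "is_antecedent": False,
    "data_type": "B",   # "B" binary, "C" categorical, "M" multi-choice
    "default_value": 0,
}

def is_active(evidence, value):
    """Per the card, an evidence carrying its default value is as if it was not synthesized."""
    return value != evidence["default_value"]
```

This makes the `default_value` semantics concrete: only non-default values count as synthesized evidences.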
## Pathology Description
The file `release_conditions.json` contains information about the pathologies that patients in the datasets may suffer from. Each pathology has the following attributes:
- **condition_name**: name of the pathology.
- **cond-name-fr**: name of the pathology in French.
- **cond-name-eng**: name of the pathology in English.
- **icd10-id**: ICD-10 code of the pathology.
- **severity**: the severity associated with the pathology. The lower the value, the more severe the pathology.
- **symptoms**: data structure describing the set of symptoms characterizing the pathology. Each symptom is represented by its corresponding `name` entry in the `release_evidences.json` file.
- **antecedents**: data structure describing the set of antecedents characterizing the pathology. Each antecedent is represented by its corresponding `name` entry in the `release_evidences.json` file.
## Patient Description
Each patient in each of the 3 sets has the following attributes:
- **AGE**: the age of the synthesized patient.
- **SEX**: the sex of the synthesized patient.
- **PATHOLOGY**: name of the ground truth pathology (`condition_name` property in the `release_conditions.json` file) that the synthesized patient is suffering from.
- **EVIDENCES**: list of evidences experienced by the patient. An evidence can either be binary, categorical or multi-choice. A categorical or multi-choice evidence is represented in the format `[evidence-name]_@_[evidence-value]` where [`evidence-name`] is the name of the evidence (`name` entry in the `release_evidences.json` file) and [`evidence-value`] is a value from the `possible-values` entry. Note that for a multi-choice evidence, it is possible to have several `[evidence-name]_@_[evidence-value]` items in the evidence list, with each item being associated with a different evidence value. A binary evidence is represented as `[evidence-name]`.
- **INITIAL_EVIDENCE**: the evidence provided by the patient to kick-start an interaction with an ASD/AD system. This is useful during model evaluation for a fair comparison of ASD/AD systems as they will all begin an interaction with a given patient from the same starting point. The initial evidence is randomly selected from the binary evidences found in the evidence list mentioned above (i.e., `EVIDENCES`) and it is part of this list.
- **DIFFERENTIAL_DIAGNOSIS**: The ground truth differential diagnosis for the patient. It is represented as a list of pairs of the form `[[patho_1, proba_1], [patho_2, proba_2], ...]` where `patho_i` is the pathology name (`condition_name` entry in the `release_conditions.json` file) and `proba_i` is its related probability.
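The `EVIDENCES` and `DIFFERENTIAL_DIAGNOSIS` encodings above can be parsed with a few lines of Python. This is a minimal sketch: the evidence names and probabilities below are hypothetical examples, only the `_@_` separator and the `[patho, proba]` pair format come from the card.

```python
def parse_evidence(item):
    """Split a '[evidence-name]_@_[evidence-value]' item into (name, value).

    Binary evidences are encoded as a bare name, so their value is None.
    """
    if "_@_" in item:
        name, value = item.split("_@_", 1)
        return name, value
    return item, None

def top_diagnosis(differential):
    """Return the pathology with the highest probability in a differential
    given as [[patho_1, proba_1], [patho_2, proba_2], ...]."""
    return max(differential, key=lambda pair: pair[1])[0]

# Hypothetical patient row values for illustration.
evidences = ["fever", "pain_location_@_head", "pain_intensity_@_7"]
parsed = [parse_evidence(e) for e in evidences]
```

Note that a multi-choice evidence may contribute several `name_@_value` items, which this parser returns as separate `(name, value)` pairs.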
## Note:
We hope this dataset will encourage future work on ASD and AD systems that considers the differential diagnosis and the severity of pathologies. It is important to keep in mind that this dataset is composed of synthetic patients and is meant for research purposes. Given the assumptions made during the generation process of this dataset, we would like to emphasize that the dataset should not be used to train and deploy a model prior to performing rigorous evaluations of the model's performance and verifying that the system has proper coverage and representation of the population it will interact with.
It is important to understand that the level of specificity, sensitivity and confidence that a physician will seek when evaluating a patient will be influenced by the clinical setting. The dataset was built for acute care and biased toward high mortality and morbidity pathologies. Physicians will tend to consider negative evidences as equally important in such a clinical context in order to evaluate high acuity diseases.
In the creation of the DDXPlus dataset, a small subset of diseases was chosen to establish a baseline. Medical professionals have to consider this very important point when reviewing the results of models trained with this dataset, as the differential is considerably smaller. A smaller differential means fewer potential evidences to collect. It is thus essential to keep this point in mind when looking at the differential produced and the evidence collected by a model based on this dataset.
For more information, please check our [paper](https://arxiv.org/abs/2205.09148). |
sanket03/midjourney_small | ---
license: mit
---
|
tanvirsrbd1/nov1_without_annotation | ---
dataset_info:
features:
- name: id
dtype: string
- name: xml
dtype: string
- name: html
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 47175891
num_examples: 1711
download_size: 5629525
dataset_size: 47175891
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "nov1_without_annotation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tr416/dataset_20231007_024652 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 74082
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231007_024652"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mtc/WCEP-filtered | ---
dataset_info:
features:
- name: document
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 10823988
num_examples: 370
download_size: 5149647
dataset_size: 10823988
---
# Dataset Card for "WCEP-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_aloobun__open-llama-3b-v2-elmv3 | ---
pretty_name: Evaluation run of aloobun/open-llama-3b-v2-elmv3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [aloobun/open-llama-3b-v2-elmv3](https://huggingface.co/aloobun/open-llama-3b-v2-elmv3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aloobun__open-llama-3b-v2-elmv3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T18:25:59.224844](https://huggingface.co/datasets/open-llm-leaderboard/details_aloobun__open-llama-3b-v2-elmv3/blob/main/results_2023-12-09T18-25-59.224844.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2804692579613333,\n\
\ \"acc_stderr\": 0.03160774886030324,\n \"acc_norm\": 0.28199113779250456,\n\
\ \"acc_norm_stderr\": 0.0323576565422058,\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.3550624387136162,\n\
\ \"mc2_stderr\": 0.01364292328900912\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3873720136518771,\n \"acc_stderr\": 0.014235872487909874,\n\
\ \"acc_norm\": 0.42150170648464164,\n \"acc_norm_stderr\": 0.014430197069326023\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.551185022903804,\n\
\ \"acc_stderr\": 0.004963567029129055,\n \"acc_norm\": 0.7326229834694284,\n\
\ \"acc_norm_stderr\": 0.004416861919100999\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n\
\ \"acc_stderr\": 0.03999262876617721,\n \"acc_norm\": 0.3111111111111111,\n\
\ \"acc_norm_stderr\": 0.03999262876617721\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137282,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137282\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2943396226415094,\n \"acc_stderr\": 0.028049186315695248,\n\
\ \"acc_norm\": 0.2943396226415094,\n \"acc_norm_stderr\": 0.028049186315695248\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.033450369167889925,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.033450369167889925\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307811,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307811\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843673,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843673\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.03416520447747549,\n\
\ \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.03416520447747549\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2804232804232804,\n \"acc_stderr\": 0.023135287974325628,\n \"\
acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.023135287974325628\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333339,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333339\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.25483870967741934,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358611,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358611\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.03567969772268049,\n\
\ \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.03567969772268049\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.31313131313131315,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.31313131313131315,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.030975436386845426,\n\
\ \"acc_norm\": 0.24352331606217617,\n \"acc_norm_stderr\": 0.030975436386845426\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.28974358974358977,\n \"acc_stderr\": 0.02300062824368796,\n\
\ \"acc_norm\": 0.28974358974358977,\n \"acc_norm_stderr\": 0.02300062824368796\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22592592592592592,\n \"acc_stderr\": 0.02549753263960955,\n \
\ \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.02549753263960955\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2773109243697479,\n \"acc_stderr\": 0.029079374539480007,\n\
\ \"acc_norm\": 0.2773109243697479,\n \"acc_norm_stderr\": 0.029079374539480007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.25137614678899084,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.25137614678899084,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.17592592592592593,\n \"acc_stderr\": 0.025967420958258533,\n \"\
acc_norm\": 0.17592592592592593,\n \"acc_norm_stderr\": 0.025967420958258533\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22058823529411764,\n \"acc_stderr\": 0.029102254389674082,\n \"\
acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.029102254389674082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.37668161434977576,\n\
\ \"acc_stderr\": 0.03252113489929188,\n \"acc_norm\": 0.37668161434977576,\n\
\ \"acc_norm_stderr\": 0.03252113489929188\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"\
acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.32038834951456313,\n \"acc_stderr\": 0.04620284082280039,\n\
\ \"acc_norm\": 0.32038834951456313,\n \"acc_norm_stderr\": 0.04620284082280039\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.029343114798094476,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.029343114798094476\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28607918263090676,\n\
\ \"acc_stderr\": 0.016160871405127532,\n \"acc_norm\": 0.28607918263090676,\n\
\ \"acc_norm_stderr\": 0.016160871405127532\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3006535947712418,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.3006535947712418,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n\
\ \"acc_stderr\": 0.025218040373410622,\n \"acc_norm\": 0.27009646302250806,\n\
\ \"acc_norm_stderr\": 0.025218040373410622\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.29012345679012347,\n \"acc_stderr\": 0.025251173936495022,\n\
\ \"acc_norm\": 0.29012345679012347,\n \"acc_norm_stderr\": 0.025251173936495022\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843017,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843017\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24185136897001303,\n\
\ \"acc_stderr\": 0.010936550813827065,\n \"acc_norm\": 0.24185136897001303,\n\
\ \"acc_norm_stderr\": 0.010936550813827065\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.22794117647058823,\n \"acc_stderr\": 0.025483081468029804,\n\
\ \"acc_norm\": 0.22794117647058823,\n \"acc_norm_stderr\": 0.025483081468029804\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \
\ \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.33636363636363636,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.33877551020408164,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.33877551020408164,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.03647168523683227,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.03647168523683227\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3508771929824561,\n \"acc_stderr\": 0.03660298834049163,\n\
\ \"acc_norm\": 0.3508771929824561,\n \"acc_norm_stderr\": 0.03660298834049163\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.3550624387136162,\n\
\ \"mc2_stderr\": 0.01364292328900912\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6495659037095501,\n \"acc_stderr\": 0.013409047676670184\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.037149355572403335,\n \
\ \"acc_stderr\": 0.0052095162830737675\n }\n}\n```"
repo_url: https://huggingface.co/aloobun/open-llama-3b-v2-elmv3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|arc:challenge|25_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|arc:challenge|25_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|gsm8k|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|gsm8k|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hellaswag|10_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hellaswag|10_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T17-18-30.999840.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T18-25-59.224844.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T18-25-59.224844.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- '**/details_harness|winogrande|5_2023-12-09T17-18-30.999840.parquet'
- split: 2023_12_09T18_25_59.224844
path:
- '**/details_harness|winogrande|5_2023-12-09T18-25-59.224844.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T18-25-59.224844.parquet'
- config_name: results
data_files:
- split: 2023_12_09T17_18_30.999840
path:
- results_2023-12-09T17-18-30.999840.parquet
- split: 2023_12_09T18_25_59.224844
path:
- results_2023-12-09T18-25-59.224844.parquet
- split: latest
path:
- results_2023-12-09T18-25-59.224844.parquet
---
# Dataset Card for Evaluation run of aloobun/open-llama-3b-v2-elmv3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/aloobun/open-llama-3b-v2-elmv3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [aloobun/open-llama-3b-v2-elmv3](https://huggingface.co/aloobun/open-llama-3b-v2-elmv3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aloobun__open-llama-3b-v2-elmv3",
"harness_winogrande_5",
	split="latest")
```
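If you want to work with the timestamped splits directly rather than rely on the "latest" alias, you can pick the most recent run yourself. A minimal sketch, assuming split names follow the `YYYY_MM_DDTHH_MM_SS.ffffff` pattern shown above (zero-padded, so lexicographic order matches chronological order):

```python
# Select the most recent timestamped split from a config's split names.
# Names like "2023_12_09T18_25_59.224844" sort chronologically as strings,
# so max() over the non-"latest" names yields the newest run.

def latest_split(split_names):
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)

splits = ["2023_12_09T17_18_30.999840", "2023_12_09T18_25_59.224844", "latest"]
print(latest_split(splits))  # 2023_12_09T18_25_59.224844
```

This mirrors how the "latest" split in the YAML above is resolved: it always points at the parquet file of the newest timestamped run.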
## Latest results
These are the [latest results from run 2023-12-09T18:25:59.224844](https://huggingface.co/datasets/open-llm-leaderboard/details_aloobun__open-llama-3b-v2-elmv3/blob/main/results_2023-12-09T18-25-59.224844.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2804692579613333,
"acc_stderr": 0.03160774886030324,
"acc_norm": 0.28199113779250456,
"acc_norm_stderr": 0.0323576565422058,
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.3550624387136162,
"mc2_stderr": 0.01364292328900912
},
"harness|arc:challenge|25": {
"acc": 0.3873720136518771,
"acc_stderr": 0.014235872487909874,
"acc_norm": 0.42150170648464164,
"acc_norm_stderr": 0.014430197069326023
},
"harness|hellaswag|10": {
"acc": 0.551185022903804,
"acc_stderr": 0.004963567029129055,
"acc_norm": 0.7326229834694284,
"acc_norm_stderr": 0.004416861919100999
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.03999262876617721,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.03999262876617721
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137282,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137282
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2943396226415094,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.2943396226415094,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.033450369167889925,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.033450369167889925
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307811,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307811
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843673,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843673
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.03416520447747549,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.03416520447747549
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.023135287974325628,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.023135287974325628
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333339,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333339
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358611,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358611
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.03567969772268049,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.03567969772268049
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.31313131313131315,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.31313131313131315,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24352331606217617,
"acc_stderr": 0.030975436386845426,
"acc_norm": 0.24352331606217617,
"acc_norm_stderr": 0.030975436386845426
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.28974358974358977,
"acc_stderr": 0.02300062824368796,
"acc_norm": 0.28974358974358977,
"acc_norm_stderr": 0.02300062824368796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.02549753263960955,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.02549753263960955
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2773109243697479,
"acc_stderr": 0.029079374539480007,
"acc_norm": 0.2773109243697479,
"acc_norm_stderr": 0.029079374539480007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25137614678899084,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.25137614678899084,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.17592592592592593,
"acc_stderr": 0.025967420958258533,
"acc_norm": 0.17592592592592593,
"acc_norm_stderr": 0.025967420958258533
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.37668161434977576,
"acc_stderr": 0.03252113489929188,
"acc_norm": 0.37668161434977576,
"acc_norm_stderr": 0.03252113489929188
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.32038834951456313,
"acc_stderr": 0.04620284082280039,
"acc_norm": 0.32038834951456313,
"acc_norm_stderr": 0.04620284082280039
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.029343114798094476,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.029343114798094476
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28607918263090676,
"acc_stderr": 0.016160871405127532,
"acc_norm": 0.28607918263090676,
"acc_norm_stderr": 0.016160871405127532
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3006535947712418,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.3006535947712418,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.025218040373410622,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.025218040373410622
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.29012345679012347,
"acc_stderr": 0.025251173936495022,
"acc_norm": 0.29012345679012347,
"acc_norm_stderr": 0.025251173936495022
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843017,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843017
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24185136897001303,
"acc_stderr": 0.010936550813827065,
"acc_norm": 0.24185136897001303,
"acc_norm_stderr": 0.010936550813827065
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22794117647058823,
"acc_stderr": 0.025483081468029804,
"acc_norm": 0.22794117647058823,
"acc_norm_stderr": 0.025483081468029804
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.33877551020408164,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.33877551020408164,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683227,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683227
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.03660298834049163,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.03660298834049163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.3550624387136162,
"mc2_stderr": 0.01364292328900912
},
"harness|winogrande|5": {
"acc": 0.6495659037095501,
"acc_stderr": 0.013409047676670184
},
"harness|gsm8k|5": {
"acc": 0.037149355572403335,
"acc_stderr": 0.0052095162830737675
}
}
```
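The per-task entries above all share the same shape (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`), so an aggregate over the MMLU subtasks can be computed from the parsed JSON. A minimal sketch (the `mmlu_average` helper and the toy dict are illustrative, not part of the evaluation harness):

```python
def mmlu_average(results: dict) -> float:
    """Mean `acc` over all `harness|hendrycksTest-*` entries in a results dict."""
    accs = [entry["acc"] for task, entry in results.items()
            if task.startswith("harness|hendrycksTest-")]
    return sum(accs) / len(accs)

# Toy subset mirroring the structure of the JSON above.
toy = {
    "harness|hendrycksTest-virology|5": {"acc": 0.3253012048192771},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.3508771929824561},
    "harness|gsm8k|5": {"acc": 0.037149355572403335},  # ignored: not an MMLU subtask
}
print(mmlu_average(toy))
```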
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/volga_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of volga/ヴォルガ/伏尔加 (Azur Lane)
This is the dataset of volga/ヴォルガ/伏尔加 (Azur Lane), containing 42 images and their tags.
The core tags of this character are `long_hair, breasts, large_breasts, hat, yellow_eyes, white_headwear, hair_between_eyes, red_hair, bangs, fur_hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 42 | 71.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/volga_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 42 | 37.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/volga_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 106 | 80.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/volga_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 42 | 63.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/volga_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 106 | 126.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/volga_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
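The IMG+TXT packages pair each image with a same-stem `.txt` file holding its tags. A minimal sketch of walking an extracted package directory and collecting `(image path, tags)` pairs (the `pair_img_txt` helper is illustrative, not part of waifuc):

```python
import os

def pair_img_txt(dataset_dir):
    """Collect (image_path, tags) pairs from an extracted IMG+TXT package."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
            txt_path = os.path.join(dataset_dir, stem + '.txt')
            if os.path.exists(txt_path):
                with open(txt_path, encoding='utf-8') as f:
                    tags = f.read().strip()
                pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```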
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/volga_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, black_gloves, cleavage, fur-trimmed_coat, looking_at_viewer, solo, white_coat, blush, open_mouth, white_dress, earrings, papakha, pink_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | cleavage | fur-trimmed_coat | looking_at_viewer | solo | white_coat | blush | open_mouth | white_dress | earrings | papakha | pink_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------|:-------------------|:--------------------|:-------|:-------------|:--------|:-------------|:--------------|:-----------|:----------|:------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_Locutusque__ChatHercules-2.5-Mistral-7B-DPO | ---
pretty_name: Evaluation run of Locutusque/ChatHercules-2.5-Mistral-7B-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/ChatHercules-2.5-Mistral-7B-DPO](https://huggingface.co/Locutusque/ChatHercules-2.5-Mistral-7B-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__ChatHercules-2.5-Mistral-7B-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T02:35:25.349975](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__ChatHercules-2.5-Mistral-7B-DPO/blob/main/results_2024-03-10T02-35-25.349975.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.65434321085394,\n\
\ \"acc_stderr\": 0.031825284831705845,\n \"acc_norm\": 0.655257152354157,\n\
\ \"acc_norm_stderr\": 0.03247573899311495,\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.522996054505985,\n\
\ \"mc2_stderr\": 0.014861512019306897\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.621160409556314,\n \"acc_stderr\": 0.014175915490000324,\n\
\ \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.01383903976282017\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6532563234415455,\n\
\ \"acc_stderr\": 0.004749606196363343,\n \"acc_norm\": 0.8540131447918742,\n\
\ \"acc_norm_stderr\": 0.0035237141526513\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724053,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724053\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\
\ \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.6994219653179191,\n\
\ \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8032258064516129,\n \"acc_stderr\": 0.022616409420742025,\n \"\
acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.022616409420742025\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.03031371053819889,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.03031371053819889\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3814814814814815,\n \"acc_stderr\": 0.029616718927497593,\n \
\ \"acc_norm\": 0.3814814814814815,\n \"acc_norm_stderr\": 0.029616718927497593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136098,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136098\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\
\ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\
\ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3206703910614525,\n\
\ \"acc_stderr\": 0.015609929559348406,\n \"acc_norm\": 0.3206703910614525,\n\
\ \"acc_norm_stderr\": 0.015609929559348406\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824782,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824782\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48370273794002605,\n\
\ \"acc_stderr\": 0.01276345073469982,\n \"acc_norm\": 0.48370273794002605,\n\
\ \"acc_norm_stderr\": 0.01276345073469982\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.02725720260611494,\n\
\ \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.02725720260611494\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6879084967320261,\n \"acc_stderr\": 0.01874501120127766,\n \
\ \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.01874501120127766\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.522996054505985,\n\
\ \"mc2_stderr\": 0.014861512019306897\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613983\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6755117513267627,\n \
\ \"acc_stderr\": 0.012896095359768111\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/ChatHercules-2.5-Mistral-7B-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|arc:challenge|25_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|gsm8k|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hellaswag|10_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|winogrande|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T02-35-25.349975.parquet'
- config_name: results
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- results_2024-03-10T02-35-25.349975.parquet
- split: latest
path:
- results_2024-03-10T02-35-25.349975.parquet
---
# Dataset Card for Evaluation run of Locutusque/ChatHercules-2.5-Mistral-7B-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/ChatHercules-2.5-Mistral-7B-DPO](https://huggingface.co/Locutusque/ChatHercules-2.5-Mistral-7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__ChatHercules-2.5-Mistral-7B-DPO",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T02:35:25.349975](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__ChatHercules-2.5-Mistral-7B-DPO/blob/main/results_2024-03-10T02-35-25.349975.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.65434321085394,
"acc_stderr": 0.031825284831705845,
"acc_norm": 0.655257152354157,
"acc_norm_stderr": 0.03247573899311495,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.522996054505985,
"mc2_stderr": 0.014861512019306897
},
"harness|arc:challenge|25": {
"acc": 0.621160409556314,
"acc_stderr": 0.014175915490000324,
"acc_norm": 0.6604095563139932,
"acc_norm_stderr": 0.01383903976282017
},
"harness|hellaswag|10": {
"acc": 0.6532563234415455,
"acc_stderr": 0.004749606196363343,
"acc_norm": 0.8540131447918742,
"acc_norm_stderr": 0.0035237141526513
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724053,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724053
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0349610148119118,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0349610148119118
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.03031371053819889,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.03031371053819889
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3814814814814815,
"acc_stderr": 0.029616718927497593,
"acc_norm": 0.3814814814814815,
"acc_norm_stderr": 0.029616718927497593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671631,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229092,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229092
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3206703910614525,
"acc_stderr": 0.015609929559348406,
"acc_norm": 0.3206703910614525,
"acc_norm_stderr": 0.015609929559348406
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824782,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700856,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700856
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48370273794002605,
"acc_stderr": 0.01276345073469982,
"acc_norm": 0.48370273794002605,
"acc_norm_stderr": 0.01276345073469982
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.02725720260611494,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.02725720260611494
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6879084967320261,
"acc_stderr": 0.01874501120127766,
"acc_norm": 0.6879084967320261,
"acc_norm_stderr": 0.01874501120127766
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.522996054505985,
"mc2_stderr": 0.014861512019306897
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.010814911009613983
},
"harness|gsm8k|5": {
"acc": 0.6755117513267627,
"acc_stderr": 0.012896095359768111
}
}
```
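The per-task records above follow the `lm-evaluation-harness` naming scheme (`harness|<task>|<n_shots>`), so they can be aggregated programmatically. A minimal sketch, using a few entries copied from the results above (in practice you would `json.load()` the full results file rather than inline a sample):

```python
# Aggregate per-task accuracies from harness-style results.
# The sample entries below are copied verbatim from the results shown above;
# a real script would json.load() the complete results JSON instead.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.8846153846153846},
    "harness|hendrycksTest-medical_genetics|5": {"acc": 0.78},
    "harness|hendrycksTest-virology|5": {"acc": 0.5481927710843374},
}

# Keep only MMLU (hendrycksTest) tasks and macro-average their accuracy.
mmlu = {k: v["acc"] for k, v in results.items() if "hendrycksTest" in k}
macro_avg = sum(mmlu.values()) / len(mmlu)
print(f"MMLU macro-average over {len(mmlu)} tasks: {macro_avg:.4f}")
```

Note that the leaderboard's headline MMLU number averages over all 57 subjects; the three-task sample here is only to illustrate the aggregation pattern.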
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |