| datasetId | card |
|---|---|
MANMEET75/InfraBotData | ---
license: mit
---
|
ohhhchank3/TLCN_20133118 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 10257413
num_examples: 20000
download_size: 5187570
dataset_size: 10257413
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Matthijs/snacks-detection | ---
pretty_name: Snacks (Detection)
task_categories:
- object-detection
- computer-vision
license: cc-by-4.0
---
# Dataset Card for Snacks (Detection)
## Dataset Summary
This is a dataset of 20 different types of snack foods that accompanies the book [Machine Learning by Tutorials](https://www.raywenderlich.com/books/machine-learning-by-tutorials/v2.0).
The images were taken from the [Google Open Images dataset](https://storage.googleapis.com/openimages/web/index.html), release 2017_11.
## Dataset Structure
Included in the **data** folder are three CSV files with bounding box annotations for the images in the dataset. Note that not all images have annotations, and some images have multiple annotations.
The columns in the CSV files are:
- `image_id`: the filename of the image without the .jpg extension
- `x_min, x_max, y_min, y_max`: normalized bounding box coordinates, i.e. in the range [0, 1]
- `class_name`: the class of the object inside the bounding box
- `folder`: the class of the image as a whole, which is also the name of the folder that contains the image
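As a minimal sketch of working with this annotation format, the snippet below parses a row in the column layout described above and converts the normalized coordinates to pixel coordinates. The `image_id` value and the image size are hypothetical, used only for illustration:

```python
import csv
import io

# Hypothetical sample row in the annotation CSV layout described above.
csv_text = """image_id,x_min,x_max,y_min,y_max,class_name,folder
009218ad38ab2010,0.18,0.62,0.22,0.78,apple,apple
"""

rows = list(csv.DictReader(io.StringIO(csv_text)))

width, height = 640, 480  # assumed image size in pixels
boxes = [
    {
        "image_id": r["image_id"],
        "class_name": r["class_name"],
        # scale normalized [0, 1] coordinates to pixel coordinates
        "x_min": round(float(r["x_min"]) * width),
        "x_max": round(float(r["x_max"]) * width),
        "y_min": round(float(r["y_min"]) * height),
        "y_max": round(float(r["y_max"]) * height),
    }
    for r in rows
]
```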
The class names are:
```nohighlight
apple
banana
cake
candy
carrot
cookie
doughnut
grape
hot dog
ice cream
juice
muffin
orange
pineapple
popcorn
pretzel
salad
strawberry
waffle
watermelon
```
**Note:** The image files are not part of this repo but [can be found here](https://huggingface.co/datasets/Matthijs/snacks).
### Data Splits
Train, Test, Validation
## Licensing Information
Just like the images from Google Open Images, the snacks dataset is licensed under Creative Commons terms.
The images are listed as having a [CC BY 2.0](https://creativecommons.org/licenses/by/2.0/) license.
The annotations are licensed by Google Inc. under a [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/) license.
|
pvisnrt/capstone_hal | ---
license: mit
dataset_info:
features:
- name: source
sequence: string
- name: summary_target
sequence: string
- name: tags
sequence:
class_label:
names:
'0': C
'1': M
'2': N
'3': O
'4': OB
'5': W
splits:
- name: train
num_bytes: 133158
num_examples: 80
- name: validation
num_bytes: 15058
num_examples: 10
- name: test
num_bytes: 22841
num_examples: 10
download_size: 50559
dataset_size: 171057
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_PY007__TinyLlama-1.1B-Chat-v0.3 | ---
pretty_name: Evaluation run of PY007/TinyLlama-1.1B-Chat-v0.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PY007/TinyLlama-1.1B-Chat-v0.3](https://huggingface.co/PY007/TinyLlama-1.1B-Chat-v0.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PY007__TinyLlama-1.1B-Chat-v0.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T05:46:52.405812](https://huggingface.co/datasets/open-llm-leaderboard/details_PY007__TinyLlama-1.1B-Chat-v0.3/blob/main/results_2023-10-23T05-46-52.405812.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0035654362416107383,\n\
\ \"em_stderr\": 0.0006104082299890309,\n \"f1\": 0.04627936241610745,\n\
\ \"f1_stderr\": 0.0012734567743311978,\n \"acc\": 0.2918883921652635,\n\
\ \"acc_stderr\": 0.00807629623065548\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0035654362416107383,\n \"em_stderr\": 0.0006104082299890309,\n\
\ \"f1\": 0.04627936241610745,\n \"f1_stderr\": 0.0012734567743311978\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006823351023502654,\n \
\ \"acc_stderr\": 0.002267537102254483\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5769534333070244,\n \"acc_stderr\": 0.013885055359056474\n\
\ }\n}\n```"
repo_url: https://huggingface.co/PY007/TinyLlama-1.1B-Chat-v0.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|arc:challenge|25_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T05_46_52.405812
path:
- '**/details_harness|drop|3_2023-10-23T05-46-52.405812.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T05-46-52.405812.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T05_46_52.405812
path:
- '**/details_harness|gsm8k|5_2023-10-23T05-46-52.405812.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T05-46-52.405812.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hellaswag|10_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-14-39.217680.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T07-14-39.217680.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T07-14-39.217680.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T05_46_52.405812
path:
- '**/details_harness|winogrande|5_2023-10-23T05-46-52.405812.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T05-46-52.405812.parquet'
- config_name: results
data_files:
- split: 2023_10_04T07_14_39.217680
path:
- results_2023-10-04T07-14-39.217680.parquet
- split: 2023_10_23T05_46_52.405812
path:
- results_2023-10-23T05-46-52.405812.parquet
- split: latest
path:
- results_2023-10-23T05-46-52.405812.parquet
---
# Dataset Card for Evaluation run of PY007/TinyLlama-1.1B-Chat-v0.3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PY007/TinyLlama-1.1B-Chat-v0.3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PY007/TinyLlama-1.1B-Chat-v0.3](https://huggingface.co/PY007/TinyLlama-1.1B-Chat-v0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PY007__TinyLlama-1.1B-Chat-v0.3",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-23T05:46:52.405812](https://huggingface.co/datasets/open-llm-leaderboard/details_PY007__TinyLlama-1.1B-Chat-v0.3/blob/main/results_2023-10-23T05-46-52.405812.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0035654362416107383,
"em_stderr": 0.0006104082299890309,
"f1": 0.04627936241610745,
"f1_stderr": 0.0012734567743311978,
"acc": 0.2918883921652635,
"acc_stderr": 0.00807629623065548
},
"harness|drop|3": {
"em": 0.0035654362416107383,
"em_stderr": 0.0006104082299890309,
"f1": 0.04627936241610745,
"f1_stderr": 0.0012734567743311978
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.002267537102254483
},
"harness|winogrande|5": {
"acc": 0.5769534333070244,
"acc_stderr": 0.013885055359056474
}
}
```
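Each timestamped split name above is simply the run timestamp with `-` and `:` replaced by `_`. A minimal sketch of that naming convention (the helper name is hypothetical):

```python
def timestamp_to_split(run_timestamp: str) -> str:
    # Split names in this repo replace "-" and ":" in the run timestamp
    # with "_"; the fractional seconds are kept as-is.
    return run_timestamp.replace("-", "_").replace(":", "_")

# The run above maps to the split name used in the configs:
print(timestamp_to_split("2023-10-23T05:46:52.405812"))
# 2023_10_23T05_46_52.405812
```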
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tj-solergibert/Europarl-ST-processed-mt-ro | ---
dataset_info:
features:
- name: source_text
dtype: string
- name: dest_text
dtype: string
- name: dest_lang
dtype:
class_label:
names:
'0': de
'1': en
'2': es
'3': fr
'4': it
'5': nl
'6': pl
'7': pt
'8': ro
splits:
- name: train
num_bytes: 139150159
num_examples: 384704
- name: valid
num_bytes: 18067165
num_examples: 48280
- name: test
num_bytes: 19720811
num_examples: 53360
download_size: 66531208
dataset_size: 176938135
---
# Dataset Card for "Europarl-ST-processed-mt-ro"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
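The `dest_lang` feature is stored as an integer class label; the mapping to language codes follows the schema above. A minimal decoding sketch (the list and helper name are illustrative, not part of the dataset's code):

```python
# Integer -> language-code mapping for the dest_lang class_label,
# taken from the feature schema above.
DEST_LANG_NAMES = ["de", "en", "es", "fr", "it", "nl", "pl", "pt", "ro"]

def decode_dest_lang(label_id: int) -> str:
    return DEST_LANG_NAMES[label_id]

print(decode_dest_lang(8))  # ro
```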
romjansen/robbert-base-v2-NER-NL-legislation-refs-data | ---
multilinguality:
- monolingual
task_categories:
- token-classification
task_ids:
- named-entity-recognition
train-eval-index:
- task: token-classification
task_id: entity_extraction
splits:
train_split: train
eval_split: test
val_split: validation
col_mapping:
tokens: tokens
ner_tags: tags
metrics:
- type: seqeval
name: seqeval
---
# Dataset description
This dataset was created for fine-tuning the model [robbert-base-v2-NER-NL-legislation-refs](https://huggingface.co/romjansen/robbert-base-v2-NER-NL-legislation-refs). It consists of 512-token examples, each containing one or more legislation references. The examples were created from a weakly labelled corpus of Dutch case law scraped from [Linked Data Overheid](https://linkeddata.overheid.nl/), which was pre-tokenized and labelled ([biluo_tags_from_offsets](https://spacy.io/api/top-level#biluo_tags_from_offsets)) with [spaCy](https://spacy.io/), and then further tokenized by applying Hugging Face's [AutoTokenizer.from_pretrained()](https://huggingface.co/docs/transformers/model_doc/auto#transformers.AutoTokenizer.from_pretrained) with the tokenizer of [pdelobelle/robbert-v2-dutch-base](https://huggingface.co/pdelobelle/robbert-v2-dutch-base). |
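The BILUO labelling step described above can be sketched as follows. This is a simplified stand-in for spaCy's `biluo_tags_from_offsets` (the function name, the `LEGREF` label, and the exact-alignment assumption are illustrative, not the actual pipeline code):

```python
def biluo_tags(token_spans, entity_spans, label="LEGREF"):
    """Simplified BILUO tagging sketch: token_spans and entity_spans are
    (start, end) character offsets; tokens are assumed to align exactly
    with entity boundaries (real pipelines must handle misalignment)."""
    tags = ["O"] * len(token_spans)
    for ent_start, ent_end in entity_spans:
        inside = [i for i, (s, e) in enumerate(token_spans)
                  if s >= ent_start and e <= ent_end]
        if not inside:
            continue
        if len(inside) == 1:
            tags[inside[0]] = f"U-{label}"      # Unit: single-token entity
        else:
            tags[inside[0]] = f"B-{label}"      # Begin
            for i in inside[1:-1]:
                tags[i] = f"I-{label}"          # Inside
            tags[inside[-1]] = f"L-{label}"     # Last
    return tags

# "art. 6 EVRM" -> tokens at offsets (0,4), (5,6), (7,11); one reference span (0,11)
print(biluo_tags([(0, 4), (5, 6), (7, 11)], [(0, 11)]))
# ['B-LEGREF', 'I-LEGREF', 'L-LEGREF']
```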
open-llm-leaderboard/details_bigscience__bloomz | ---
pretty_name: Evaluation run of None
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [None](https://huggingface.co/None) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 61 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigscience__bloomz\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T12:14:13.875692](https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloomz/blob/main/results_2023-08-29T12%3A14%3A13.875692.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47917755403252543,\n\
\ \"acc_stderr\": 0.03572101484290109,\n \"acc_norm\": 0.48335485520551164,\n\
\ \"acc_norm_stderr\": 0.0357085480998606,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.4393940961026447,\n\
\ \"mc2_stderr\": 0.015292532701908591\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5042662116040956,\n \"acc_stderr\": 0.014610858923956955,\n\
\ \"acc_norm\": 0.5537542662116041,\n \"acc_norm_stderr\": 0.014526705548539982\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5553674566819359,\n\
\ \"acc_stderr\": 0.004959094146471527,\n \"acc_norm\": 0.7523401712806214,\n\
\ \"acc_norm_stderr\": 0.004307709682499536\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480863,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480863\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.5735849056603773,\n \"acc_stderr\": 0.030437794342983052,\n \
\ \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.030437794342983052\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266344,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266344\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.041349130183033156,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.041349130183033156\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.532258064516129,\n \"acc_stderr\": 0.028384747788813332,\n \"\
acc_norm\": 0.532258064516129,\n \"acc_norm_stderr\": 0.028384747788813332\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39408866995073893,\n \"acc_stderr\": 0.034381579670365446,\n \"\
acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.034381579670365446\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.49696969696969695,\n \"acc_stderr\": 0.03904272341431857,\n\
\ \"acc_norm\": 0.49696969696969695,\n \"acc_norm_stderr\": 0.03904272341431857\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6616161616161617,\n \"acc_stderr\": 0.03371124142626303,\n \"\
acc_norm\": 0.6616161616161617,\n \"acc_norm_stderr\": 0.03371124142626303\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.616580310880829,\n \"acc_stderr\": 0.03508984236295342,\n\
\ \"acc_norm\": 0.616580310880829,\n \"acc_norm_stderr\": 0.03508984236295342\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.47435897435897434,\n \"acc_stderr\": 0.02531764972644865,\n\
\ \"acc_norm\": 0.47435897435897434,\n \"acc_norm_stderr\": 0.02531764972644865\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n\
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6623853211009174,\n \"acc_stderr\": 0.020275265986638924,\n \"\
acc_norm\": 0.6623853211009174,\n \"acc_norm_stderr\": 0.020275265986638924\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5196078431372549,\n \"acc_stderr\": 0.03506612560524866,\n \"\
acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.03506612560524866\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6919831223628692,\n \"acc_stderr\": 0.0300523893356057,\n \
\ \"acc_norm\": 0.6919831223628692,\n \"acc_norm_stderr\": 0.0300523893356057\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4977578475336323,\n\
\ \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.4977578475336323,\n\
\ \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4961832061068702,\n \"acc_stderr\": 0.043851623256015534,\n\
\ \"acc_norm\": 0.4961832061068702,\n \"acc_norm_stderr\": 0.043851623256015534\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4793388429752066,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.4793388429752066,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n\
\ \"acc_stderr\": 0.047803436269367894,\n \"acc_norm\": 0.5740740740740741,\n\
\ \"acc_norm_stderr\": 0.047803436269367894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49079754601226994,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.49079754601226994,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.717948717948718,\n\
\ \"acc_stderr\": 0.02948036054954119,\n \"acc_norm\": 0.717948717948718,\n\
\ \"acc_norm_stderr\": 0.02948036054954119\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6232439335887612,\n\
\ \"acc_stderr\": 0.01732829290730305,\n \"acc_norm\": 0.6232439335887612,\n\
\ \"acc_norm_stderr\": 0.01732829290730305\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.026919095102908273,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.026919095102908273\n \
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2770949720670391,\n\
\ \"acc_stderr\": 0.014968772435812145,\n \"acc_norm\": 0.2770949720670391,\n\
\ \"acc_norm_stderr\": 0.014968772435812145\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.02862930519400354,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.02862930519400354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4919614147909968,\n\
\ \"acc_stderr\": 0.028394421370984545,\n \"acc_norm\": 0.4919614147909968,\n\
\ \"acc_norm_stderr\": 0.028394421370984545\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4691358024691358,\n \"acc_stderr\": 0.02776768960683393,\n\
\ \"acc_norm\": 0.4691358024691358,\n \"acc_norm_stderr\": 0.02776768960683393\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611324,\n \
\ \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611324\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3116036505867014,\n\
\ \"acc_stderr\": 0.011829039182849648,\n \"acc_norm\": 0.3116036505867014,\n\
\ \"acc_norm_stderr\": 0.011829039182849648\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.030332578094555033,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.030332578094555033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4950980392156863,\n \"acc_stderr\": 0.020226862710039473,\n \
\ \"acc_norm\": 0.4950980392156863,\n \"acc_norm_stderr\": 0.020226862710039473\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4909090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.4909090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5265306122448979,\n \"acc_stderr\": 0.03196412734523272,\n\
\ \"acc_norm\": 0.5265306122448979,\n \"acc_norm_stderr\": 0.03196412734523272\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5671641791044776,\n\
\ \"acc_stderr\": 0.0350349092367328,\n \"acc_norm\": 0.5671641791044776,\n\
\ \"acc_norm_stderr\": 0.0350349092367328\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.03834234744164993,\n\
\ \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.03834234744164993\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.4393940961026447,\n\
\ \"mc2_stderr\": 0.015292532701908591\n }\n}\n```"
repo_url: https://huggingface.co/None
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|arc:challenge|25_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hellaswag|10_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T12:14:13.875692.parquet'
- config_name: results
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- results_2023-08-29T12:14:13.875692.parquet
- split: latest
path:
- results_2023-08-29T12:14:13.875692.parquet
---
# Dataset Card for Evaluation run of bigscience/bloomz
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bigscience/bloomz
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bigscience/bloomz](https://huggingface.co/bigscience/bloomz) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigscience__bloomz",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-29T12:14:13.875692](https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloomz/blob/main/results_2023-08-29T12%3A14%3A13.875692.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each task's configuration):
```python
{
"all": {
"acc": 0.47917755403252543,
"acc_stderr": 0.03572101484290109,
"acc_norm": 0.48335485520551164,
"acc_norm_stderr": 0.0357085480998606,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.4393940961026447,
"mc2_stderr": 0.015292532701908591
},
"harness|arc:challenge|25": {
"acc": 0.5042662116040956,
"acc_stderr": 0.014610858923956955,
"acc_norm": 0.5537542662116041,
"acc_norm_stderr": 0.014526705548539982
},
"harness|hellaswag|10": {
"acc": 0.5553674566819359,
"acc_stderr": 0.004959094146471527,
"acc_norm": 0.7523401712806214,
"acc_norm_stderr": 0.004307709682499536
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480863,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480863
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.030437794342983052,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.030437794342983052
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266344,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266344
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.041349130183033156,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.041349130183033156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.532258064516129,
"acc_stderr": 0.028384747788813332,
"acc_norm": 0.532258064516129,
"acc_norm_stderr": 0.028384747788813332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.034381579670365446,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.034381579670365446
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.49696969696969695,
"acc_stderr": 0.03904272341431857,
"acc_norm": 0.49696969696969695,
"acc_norm_stderr": 0.03904272341431857
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6616161616161617,
"acc_stderr": 0.03371124142626303,
"acc_norm": 0.6616161616161617,
"acc_norm_stderr": 0.03371124142626303
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.616580310880829,
"acc_stderr": 0.03508984236295342,
"acc_norm": 0.616580310880829,
"acc_norm_stderr": 0.03508984236295342
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47435897435897434,
"acc_stderr": 0.02531764972644865,
"acc_norm": 0.47435897435897434,
"acc_norm_stderr": 0.02531764972644865
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6623853211009174,
"acc_stderr": 0.020275265986638924,
"acc_norm": 0.6623853211009174,
"acc_norm_stderr": 0.020275265986638924
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.03506612560524866,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.03506612560524866
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6919831223628692,
"acc_stderr": 0.0300523893356057,
"acc_norm": 0.6919831223628692,
"acc_norm_stderr": 0.0300523893356057
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4977578475336323,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.4977578475336323,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4961832061068702,
"acc_stderr": 0.043851623256015534,
"acc_norm": 0.4961832061068702,
"acc_norm_stderr": 0.043851623256015534
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4793388429752066,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.4793388429752066,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.047803436269367894,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.047803436269367894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49079754601226994,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.49079754601226994,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.02948036054954119,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.02948036054954119
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6232439335887612,
"acc_stderr": 0.01732829290730305,
"acc_norm": 0.6232439335887612,
"acc_norm_stderr": 0.01732829290730305
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5,
"acc_stderr": 0.026919095102908273,
"acc_norm": 0.5,
"acc_norm_stderr": 0.026919095102908273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2770949720670391,
"acc_stderr": 0.014968772435812145,
"acc_norm": 0.2770949720670391,
"acc_norm_stderr": 0.014968772435812145
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4919614147909968,
"acc_stderr": 0.028394421370984545,
"acc_norm": 0.4919614147909968,
"acc_norm_stderr": 0.028394421370984545
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4691358024691358,
"acc_stderr": 0.02776768960683393,
"acc_norm": 0.4691358024691358,
"acc_norm_stderr": 0.02776768960683393
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611324,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611324
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3116036505867014,
"acc_stderr": 0.011829039182849648,
"acc_norm": 0.3116036505867014,
"acc_norm_stderr": 0.011829039182849648
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.030332578094555033,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.030332578094555033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4950980392156863,
"acc_stderr": 0.020226862710039473,
"acc_norm": 0.4950980392156863,
"acc_norm_stderr": 0.020226862710039473
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5265306122448979,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.5265306122448979,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5671641791044776,
"acc_stderr": 0.0350349092367328,
"acc_norm": 0.5671641791044776,
"acc_norm_stderr": 0.0350349092367328
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.03834234744164993,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.03834234744164993
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.4393940961026447,
"mc2_stderr": 0.015292532701908591
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
LahiruLowe/flan2021_filtered_3pertask | ---
dataset_info:
features:
- name: original_index
dtype: int64
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
splits:
- name: train
num_bytes: 216227
num_examples: 210
download_size: 0
dataset_size: 216227
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "flan2021_filtered_3pertask"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/1M_SDXL_Refiner_Prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 710084590
num_examples: 1000000
download_size: 56382528
dataset_size: 710084590
license: apache-2.0
language:
- en
size_categories:
- 1M<n<10M
---
# 1 Million Prompts for SDXL Refiner: Text-to-Image Generating Prompts
**Dataset Name:** 1 Million Prompts for SDXL Refiner: Text-to-Image Generating Prompts
**Author:** Falah G. Salieh
**Description:** This dataset contains 1 million creative prompts designed for generating text-to-image refiner examples using the SDXL model. The prompts cover a wide range of art styles and creative applications, providing a diverse set of text descriptions for inspiring image generation. These prompts are curated to encourage imaginative and diverse visual outputs.
**Contributions:** Contributions to this dataset are welcome. Feel free to submit additional prompts, variations, or modifications to enhance the diversity and creativity of the generated images.
**Dataset Information:**
- **Features:**
- `prompts`: Textual descriptions of creative prompts.
- **Splits:**
- `train`: Training split with 1,000,000 examples.
- **Size:**
- Download Size: 56.38 MB
- Dataset Size: 710.08 MB
**Usage:**
This dataset is intended for use in refining and enhancing text-to-image generation models like SDXL. It can also be employed for training, fine-tuning, and evaluating such models for creative applications.
**Example:**
Here's an example prompt generated by the author:
```json
{
"prompt": "In the enchanting realm of creativity, skilled artists have mastered the art of capturing reality through their lens. Using the latest digital tools and software, these visionary women bring to life stunningly accurate portrayals of young Arabic female, expressing curiosity, detailed face, captivating, full elegant dress, flowing auburn locks, closeup, deep and captivating eyes, interior home background, photorealistic, highly detailed, soft and diffused lighting, concept art, (photography:2.5), sharp focus, digital art, award-winning, ultra high resolution, vibrant colors, mirrorless camera, neon lights, dynamic perspectives, impressionism. Imbuing every detail with meticulous precision, they seamlessly blend their subjects into various environments, capturing them from unique camera angles. The essence of Arabic female is masterfully evident in their work, evoking feelings of awe among viewers."
}
```

**Citation:**

```bibtex
@dataset{falah_salieh/1_million_prompts_sdxl_refiner,
  title     = "1 Million Prompts for SDXL Refiner: Text-to-Image Generating Prompts",
  author    = "Falah G. Salieh",
  year      = "2023",
  publisher = "Hugging Face",
  url       = "https://huggingface.co/datasets/falah_salieh/1_million_prompts_sdxl_refiner"
}
```
|
FanChen0116/19100_chat_50x_slot | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-time
'2': B-date
'3': B-last_name
'4': B-people
'5': I-date
'6': I-people
'7': I-last_name
'8': I-first_name
'9': B-first_name
'10': B-time
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 580637
num_examples: 3200
- name: validation
num_bytes: 5405
num_examples: 32
- name: test
num_bytes: 646729
num_examples: 3731
download_size: 0
dataset_size: 1232771
---
# Dataset Card for "19100_chat_50x_slot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
veerav96/sokoto_coventry_normalized | ---
license: apache-2.0
---
|
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_14_500 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 908
num_examples: 32
download_size: 2074
dataset_size: 908
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_14_500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AR2021/cybersecurity-corpus-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 33142
num_examples: 789
download_size: 14328
dataset_size: 33142
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cybersecurity-corpus-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aminlouhichi/donutTOPSOLIDTIMCOD2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 9934958.0
num_examples: 46
- name: validation
num_bytes: 9934958.0
num_examples: 46
- name: test
num_bytes: 9934958.0
num_examples: 46
download_size: 27390966
dataset_size: 29804874.0
---
# Dataset Card for "donutTOPSOLIDTIMCOD2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nicolas-BZRD/INCA_opendata | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2816739990
num_examples: 373751
download_size: 1125426154
dataset_size: 2816739990
license: odc-by
language:
- fr
tags:
- legal
size_categories:
- 100K<n<1M
---
# INCA
[Texts of unpublished judgments](https://echanges.dila.gouv.fr/OPENDATA/INCA/) (not published in the Bulletin) distributed by the Court of Cassation's competition fund since 1989.
In accordance with the CNIL recommendation of 29 November 2001, personal data concerning individuals (parties and witnesses) is pseudonymised. |
Neel-Gupta/minipile-processed_1024 | ---
dataset_info:
features:
- name: text
sequence:
sequence:
sequence: int64
splits:
- name: train
num_bytes: 16663470768
num_examples: 1323
- name: test
num_bytes: 125952160
num_examples: 10
download_size: 1643767974
dataset_size: 16789422928
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/azusagawa_kaede_seishunbutayarou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Azusagawa Kaede
This is the dataset of Azusagawa Kaede, containing 177 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 177 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 411 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 177 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 177 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 177 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 177 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 177 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 411 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 411 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 411 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
sanchit-gandhi/common_voice_16_1_hi_pseudo_labelled | ---
dataset_info:
config_name: hi
features:
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: condition_on_prev
sequence: int64
- name: whisper_transcript
dtype: string
splits:
- name: train
num_bytes: 634978222.0
num_examples: 717
- name: validation
num_bytes: 359634707.0
num_examples: 405
- name: test
num_bytes: 501454543.0
num_examples: 575
download_size: 1400598630
dataset_size: 1496067472.0
configs:
- config_name: hi
data_files:
- split: train
path: hi/train-*
- split: validation
path: hi/validation-*
- split: test
path: hi/test-*
---
# Common Voice 16.1 Hindi Pseudo-Labelled
This is the [Common Voice 16.1](https://huggingface.co/datasets/mozilla-foundation/common_voice_16_1) Hindi split pseudo-labelled using the [Whisper large-v3](https://huggingface.co/openai/whisper-large-v3) model, according to the instructions detailed
in the Distil-Whisper repository. To reproduce this pseudo-labelling run, follow the instructions detailed [here](https://github.com/huggingface/distil-whisper/tree/main/training#1-pseudo-labelling).
|
h2oai/h2o-translated-chinese-med-prompts | ---
license: apache-2.0
---
# Translated Chinese Medical Prompts
This repository contains medical prompts originally translated from Chinese, which can be used as training data for natural language processing (NLP) tasks in the English-language medical domain.
## Dataset Description
The dataset consists of a collection of medical prompts originally written in Chinese that have been translated into English. These prompts cover various medical topics, including symptoms, diagnoses, treatments, medications, and general healthcare information. Each prompt is paired with its corresponding English translation.
The dataset can be useful for training and evaluating machine learning models on tasks such as medical chatbots, named entity recognition (NER), information retrieval, and other NLP applications in the medical domain.
## Dataset Format
The dataset is provided in a comma-separated values (CSV) file format with the following columns:
- Prompt: The translated medical prompt in the English language.
- Answer: The corresponding response to the prompt
## Usage
Researchers and developers interested in using the dataset can clone this repository and access the dataset file `translated-chinese-med-prompts`. The dataset can be loaded and processed using common data manipulation libraries or frameworks such as pandas in Python.
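As a minimal sketch of that workflow, the following uses only the Python standard library's `csv` module (pandas' `read_csv` works just as well). The two sample rows are hypothetical stand-ins for the repository's actual CSV contents:

```python
import csv
import io

# Two hypothetical rows standing in for the repository's actual CSV file;
# the real file uses the same Prompt/Answer column layout described above.
csv_text = """Prompt,Answer
What are common symptoms of anemia?,Fatigue and pallor are typical symptoms.
How is hypertension usually managed?,With lifestyle changes and medication.
"""

# Parse each row into a dict keyed by the header names.
rows = list(csv.DictReader(io.StringIO(csv_text)))
prompts = [row["Prompt"] for row in rows]
answers = [row["Answer"] for row in rows]
```

With pandas, the equivalent would be `pd.read_csv(path)` followed by accessing the `Prompt` and `Answer` columns.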
## Contribution
Contributions to this repository, such as adding more translated Chinese medical prompts or improving the dataset format, are welcome. Please create a pull request with your proposed changes.
## License
The dataset is provided under the apache-2.0 License, which allows for unrestricted commercial and non-commercial use.
Please note that while efforts have been made to ensure the accuracy of the translations, there may still be potential errors or inaccuracies. Users of the dataset are encouraged to review and verify the translations as needed. |
pyakymenko/test_dev | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 57651.0
num_examples: 2
download_size: 51674
dataset_size: 57651.0
---
# Dataset Card for "test_dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TopicNet/WikiRef-220 | ---
language:
- en
multilinguality:
- monolingual
license: other
license_name: topicnet
license_link: >-
https://github.com/machine-intelligence-laboratory/TopicNet/blob/master/LICENSE.txt
configs:
- config_name: "bag-of-words"
default: true
data_files:
- split: train
path: "data/wiki_ref220_bow.csv.gz"
- config_name: "natural-order-of-words"
data_files:
- split: train
path: "data/wiki_ref220_natural_order.csv.gz"
task_categories:
- text-classification
task_ids:
- topic-classification
- multi-class-classification
- multi-label-classification
tags:
- topic-modeling
- topic-modelling
- text-clustering
- multimodal-data
- multimodal-learning
- modalities
- document-representation
---
# WikiRef220
## References
1. Gialampoukidis, I., Vrochidis, S., & Kompatsiaris, I. (2016). A Hybrid Framework for News Clustering Based on the DBSCAN-Martingale and LDA. In Machine Learning and Data Mining in Pattern Recognition (pp. 170-184). Springer International Publishing.
|
version-control/ds-lib-version-2-normalized | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: version
list:
- name: pyproject.toml
struct:
- name: matplotlib
dtype: string
- name: numpy
dtype: string
- name: pandas
dtype: string
- name: scikit-learn
dtype: string
- name: scipy
dtype: string
- name: tensorflow
dtype: string
- name: torch
dtype: string
- name: requirements.txt
struct:
- name: matplotlib
dtype: string
- name: numpy
dtype: string
- name: pandas
dtype: string
- name: scikit-learn
dtype: string
- name: scipy
dtype: string
- name: tensorflow
dtype: string
- name: torch
dtype: string
- name: setup.py
struct:
- name: matplotlib
dtype: string
- name: numpy
dtype: string
- name: pandas
dtype: string
- name: scikit-learn
dtype: string
- name: scipy
dtype: string
- name: tensorflow
dtype: string
- name: torch
dtype: string
- name: hexsha
sequence: string
- name: normalized_version
list:
- name: matplotlib
dtype: string
- name: numpy
dtype: string
- name: pandas
dtype: string
- name: scikit-learn
dtype: string
- name: scipy
dtype: string
- name: tensorflow
dtype: string
- name: torch
dtype: string
splits:
- name: train
num_bytes: 3129289
num_examples: 10000
download_size: 895442
dataset_size: 3129289
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-us_foreign_policy-rule-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 58872
num_examples: 100
download_size: 36510
dataset_size: 58872
---
# Dataset Card for "mmlu-us_foreign_policy-rule-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AravindVadlapudi02/UA_speech_very-low | ---
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': control
'1': pathology
- name: input_features
sequence:
sequence: float32
splits:
- name: train
num_bytes: 766344936
num_examples: 798
- name: test
num_bytes: 4599029948
num_examples: 4789
download_size: 619863392
dataset_size: 5365374884
---
# Dataset Card for "UA_speech_very-low"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FanChen0116/bus_few4_50x_pvi | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-from_location
'2': B-from_location
'3': B-leaving_date
'4': I-leaving_date
'5': I-to_location
'6': B-to_location
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 431503
num_examples: 1750
- name: validation
num_bytes: 6900
num_examples: 35
- name: test
num_bytes: 70618
num_examples: 377
download_size: 54596
dataset_size: 509021
---
# Dataset Card for "bus_few4_50x_pvi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anan-2024/twitter_dataset_1713083628 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 23475
num_examples: 54
download_size: 12351
dataset_size: 23475
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
unigram/fol-00 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: string
- name: proof
dtype: string
- name: premise_tptp
dtype: string
- name: hypothesis_tptp
dtype: string
- name: deberta_pred
dtype: string
splits:
- name: train
num_bytes: 560956746.1151958
num_examples: 96427
- name: validation
num_bytes: 70123229.15441287
num_examples: 12054
- name: test
num_bytes: 70117411.73039143
num_examples: 12053
download_size: 129740674
dataset_size: 701197387.0
---
# Dataset Card for "fol-00"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_adamo1139__yi-34b-200k-rawrr-dpo-1 | ---
pretty_name: Evaluation run of adamo1139/yi-34b-200k-rawrr-dpo-1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [adamo1139/yi-34b-200k-rawrr-dpo-1](https://huggingface.co/adamo1139/yi-34b-200k-rawrr-dpo-1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adamo1139__yi-34b-200k-rawrr-dpo-1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T12:24:51.812406](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__yi-34b-200k-rawrr-dpo-1/blob/main/results_2024-01-16T12-24-51.812406.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7557329155625617,\n\
\ \"acc_stderr\": 0.02836045891045506,\n \"acc_norm\": 0.7606955903500686,\n\
\ \"acc_norm_stderr\": 0.02889015510293627,\n \"mc1\": 0.3929008567931457,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.5399803687437482,\n\
\ \"mc2_stderr\": 0.014956918567738575\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.014131176760131172,\n\
\ \"acc_norm\": 0.6544368600682594,\n \"acc_norm_stderr\": 0.013896938461145675\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6570404301931886,\n\
\ \"acc_stderr\": 0.00473727969103619,\n \"acc_norm\": 0.8569010157339175,\n\
\ \"acc_norm_stderr\": 0.0034945810763985265\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n\
\ \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n\
\ \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8618421052631579,\n \"acc_stderr\": 0.028081042939576552,\n\
\ \"acc_norm\": 0.8618421052631579,\n \"acc_norm_stderr\": 0.028081042939576552\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n\
\ \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8226415094339623,\n \"acc_stderr\": 0.02350873921884694,\n\
\ \"acc_norm\": 0.8226415094339623,\n \"acc_norm_stderr\": 0.02350873921884694\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.875,\n\
\ \"acc_stderr\": 0.02765610492929436,\n \"acc_norm\": 0.875,\n \
\ \"acc_norm_stderr\": 0.02765610492929436\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n\
\ \"acc_stderr\": 0.033450369167889904,\n \"acc_norm\": 0.7398843930635838,\n\
\ \"acc_norm_stderr\": 0.033450369167889904\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.04974229460422817,\n\
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.04974229460422817\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n\
\ \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7702127659574468,\n \"acc_stderr\": 0.02750175294441242,\n\
\ \"acc_norm\": 0.7702127659574468,\n \"acc_norm_stderr\": 0.02750175294441242\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n\
\ \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n\
\ \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7655172413793103,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.7655172413793103,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6402116402116402,\n \"acc_stderr\": 0.024718075944129277,\n \"\
acc_norm\": 0.6402116402116402,\n \"acc_norm_stderr\": 0.024718075944129277\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5952380952380952,\n\
\ \"acc_stderr\": 0.043902592653775635,\n \"acc_norm\": 0.5952380952380952,\n\
\ \"acc_norm_stderr\": 0.043902592653775635\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.896774193548387,\n \"acc_stderr\": 0.01730838128103453,\n \"acc_norm\"\
: 0.896774193548387,\n \"acc_norm_stderr\": 0.01730838128103453\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6699507389162561,\n\
\ \"acc_stderr\": 0.033085304262282574,\n \"acc_norm\": 0.6699507389162561,\n\
\ \"acc_norm_stderr\": 0.033085304262282574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"\
acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781675,\n \
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781675\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9292929292929293,\n \"acc_stderr\": 0.01826310542019949,\n \"\
acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.01826310542019949\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909039,\n\
\ \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909039\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8128205128205128,\n \"acc_stderr\": 0.019776601086550036,\n\
\ \"acc_norm\": 0.8128205128205128,\n \"acc_norm_stderr\": 0.019776601086550036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4074074074074074,\n \"acc_stderr\": 0.029958249250082114,\n \
\ \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.029958249250082114\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8319327731092437,\n \"acc_stderr\": 0.02428910211569226,\n \
\ \"acc_norm\": 0.8319327731092437,\n \"acc_norm_stderr\": 0.02428910211569226\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5165562913907285,\n \"acc_stderr\": 0.04080244185628972,\n \"\
acc_norm\": 0.5165562913907285,\n \"acc_norm_stderr\": 0.04080244185628972\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9211009174311927,\n \"acc_stderr\": 0.011558198113769574,\n \"\
acc_norm\": 0.9211009174311927,\n \"acc_norm_stderr\": 0.011558198113769574\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6435185185185185,\n \"acc_stderr\": 0.032664783315272714,\n \"\
acc_norm\": 0.6435185185185185,\n \"acc_norm_stderr\": 0.032664783315272714\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"\
acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \
\ \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9090909090909091,\n \"acc_stderr\": 0.02624319405407388,\n \"\
acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.02624319405407388\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8834355828220859,\n \"acc_stderr\": 0.02521232721050711,\n\
\ \"acc_norm\": 0.8834355828220859,\n \"acc_norm_stderr\": 0.02521232721050711\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761011,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761011\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n\
\ \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n\
\ \"acc_norm_stderr\": 0.016046261631673137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9016602809706258,\n\
\ \"acc_stderr\": 0.010648356301876341,\n \"acc_norm\": 0.9016602809706258,\n\
\ \"acc_norm_stderr\": 0.010648356301876341\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.02115267696657528,\n\
\ \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.02115267696657528\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6636871508379888,\n\
\ \"acc_stderr\": 0.015801003729145887,\n \"acc_norm\": 0.6636871508379888,\n\
\ \"acc_norm_stderr\": 0.015801003729145887\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8660130718954249,\n \"acc_stderr\": 0.019504890618464815,\n\
\ \"acc_norm\": 0.8660130718954249,\n \"acc_norm_stderr\": 0.019504890618464815\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8360128617363344,\n\
\ \"acc_stderr\": 0.021029576464662695,\n \"acc_norm\": 0.8360128617363344,\n\
\ \"acc_norm_stderr\": 0.021029576464662695\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.01887735383957185,\n\
\ \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.01887735383957185\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6063829787234043,\n \"acc_stderr\": 0.029144544781596154,\n \
\ \"acc_norm\": 0.6063829787234043,\n \"acc_norm_stderr\": 0.029144544781596154\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5971316818774446,\n\
\ \"acc_stderr\": 0.012526955577118012,\n \"acc_norm\": 0.5971316818774446,\n\
\ \"acc_norm_stderr\": 0.012526955577118012\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8161764705882353,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.8161764705882353,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8202614379084967,\n \"acc_stderr\": 0.01553374508338279,\n \
\ \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.01553374508338279\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8244897959183674,\n \"acc_stderr\": 0.02435280072297001,\n\
\ \"acc_norm\": 0.8244897959183674,\n \"acc_norm_stderr\": 0.02435280072297001\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9054726368159204,\n\
\ \"acc_stderr\": 0.0206871869515341,\n \"acc_norm\": 0.9054726368159204,\n\
\ \"acc_norm_stderr\": 0.0206871869515341\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n\
\ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3929008567931457,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.5399803687437482,\n\
\ \"mc2_stderr\": 0.014956918567738575\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8279400157853196,\n \"acc_stderr\": 0.010607731615247008\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6178923426838514,\n \
\ \"acc_stderr\": 0.013384173935648492\n }\n}\n```"
repo_url: https://huggingface.co/adamo1139/yi-34b-200k-rawrr-dpo-1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|arc:challenge|25_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|gsm8k|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hellaswag|10_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T12-24-51.812406.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T12-24-51.812406.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- '**/details_harness|winogrande|5_2024-01-16T12-24-51.812406.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T12-24-51.812406.parquet'
- config_name: results
data_files:
- split: 2024_01_16T12_24_51.812406
path:
- results_2024-01-16T12-24-51.812406.parquet
- split: latest
path:
- results_2024-01-16T12-24-51.812406.parquet
---
# Dataset Card for Evaluation run of adamo1139/yi-34b-200k-rawrr-dpo-1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [adamo1139/yi-34b-200k-rawrr-dpo-1](https://huggingface.co/adamo1139/yi-34b-200k-rawrr-dpo-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adamo1139__yi-34b-200k-rawrr-dpo-1",
"harness_winogrande_5",
split="train")
```
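Each evaluated task has its own configuration name, derived from the harness task identifier by replacing `|`, `:` and `-` with `_` (e.g. `harness|winogrande|5` becomes `harness_winogrande_5`). A small helper sketching that mapping, inferred from the config names listed in this card rather than taken from any official API:

```python
def task_to_config_name(task_id: str) -> str:
    """Map a harness task id like 'harness|truthfulqa:mc|0' to the
    corresponding config name used when calling load_dataset."""
    return task_id.replace("|", "_").replace(":", "_").replace("-", "_")

print(task_to_config_name("harness|winogrande|5"))     # harness_winogrande_5
print(task_to_config_name("harness|truthfulqa:mc|0"))  # harness_truthfulqa_mc_0
```

The resulting string can be passed as the second argument to `load_dataset` as shown above.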
## Latest results
These are the [latest results from run 2024-01-16T12:24:51.812406](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__yi-34b-200k-rawrr-dpo-1/blob/main/results_2024-01-16T12-24-51.812406.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7557329155625617,
"acc_stderr": 0.02836045891045506,
"acc_norm": 0.7606955903500686,
"acc_norm_stderr": 0.02889015510293627,
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.5399803687437482,
"mc2_stderr": 0.014956918567738575
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.014131176760131172,
"acc_norm": 0.6544368600682594,
"acc_norm_stderr": 0.013896938461145675
},
"harness|hellaswag|10": {
"acc": 0.6570404301931886,
"acc_stderr": 0.00473727969103619,
"acc_norm": 0.8569010157339175,
"acc_norm_stderr": 0.0034945810763985265
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8618421052631579,
"acc_stderr": 0.028081042939576552,
"acc_norm": 0.8618421052631579,
"acc_norm_stderr": 0.028081042939576552
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8226415094339623,
"acc_stderr": 0.02350873921884694,
"acc_norm": 0.8226415094339623,
"acc_norm_stderr": 0.02350873921884694
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.875,
"acc_stderr": 0.02765610492929436,
"acc_norm": 0.875,
"acc_norm_stderr": 0.02765610492929436
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.033450369167889904,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.033450369167889904
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7702127659574468,
"acc_stderr": 0.02750175294441242,
"acc_norm": 0.7702127659574468,
"acc_norm_stderr": 0.02750175294441242
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7655172413793103,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.7655172413793103,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6402116402116402,
"acc_stderr": 0.024718075944129277,
"acc_norm": 0.6402116402116402,
"acc_norm_stderr": 0.024718075944129277
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5952380952380952,
"acc_stderr": 0.043902592653775635,
"acc_norm": 0.5952380952380952,
"acc_norm_stderr": 0.043902592653775635
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.01730838128103453,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.01730838128103453
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6699507389162561,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.6699507389162561,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781675,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781675
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.01826310542019949,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.01826310542019949
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909039,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909039
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8128205128205128,
"acc_stderr": 0.019776601086550036,
"acc_norm": 0.8128205128205128,
"acc_norm_stderr": 0.019776601086550036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.029958249250082114,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.029958249250082114
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8319327731092437,
"acc_stderr": 0.02428910211569226,
"acc_norm": 0.8319327731092437,
"acc_norm_stderr": 0.02428910211569226
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5165562913907285,
"acc_stderr": 0.04080244185628972,
"acc_norm": 0.5165562913907285,
"acc_norm_stderr": 0.04080244185628972
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9211009174311927,
"acc_stderr": 0.011558198113769574,
"acc_norm": 0.9211009174311927,
"acc_norm_stderr": 0.011558198113769574
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6435185185185185,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.6435185185185185,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.02624319405407388,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.02624319405407388
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8834355828220859,
"acc_stderr": 0.02521232721050711,
"acc_norm": 0.8834355828220859,
"acc_norm_stderr": 0.02521232721050711
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5625,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761011,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761011
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673137,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9016602809706258,
"acc_stderr": 0.010648356301876341,
"acc_norm": 0.9016602809706258,
"acc_norm_stderr": 0.010648356301876341
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.02115267696657528,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.02115267696657528
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6636871508379888,
"acc_stderr": 0.015801003729145887,
"acc_norm": 0.6636871508379888,
"acc_norm_stderr": 0.015801003729145887
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8660130718954249,
"acc_stderr": 0.019504890618464815,
"acc_norm": 0.8660130718954249,
"acc_norm_stderr": 0.019504890618464815
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8360128617363344,
"acc_stderr": 0.021029576464662695,
"acc_norm": 0.8360128617363344,
"acc_norm_stderr": 0.021029576464662695
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.01887735383957185,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.01887735383957185
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6063829787234043,
"acc_stderr": 0.029144544781596154,
"acc_norm": 0.6063829787234043,
"acc_norm_stderr": 0.029144544781596154
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5971316818774446,
"acc_stderr": 0.012526955577118012,
"acc_norm": 0.5971316818774446,
"acc_norm_stderr": 0.012526955577118012
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8161764705882353,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.8161764705882353,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.01553374508338279,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.01553374508338279
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8244897959183674,
"acc_stderr": 0.02435280072297001,
"acc_norm": 0.8244897959183674,
"acc_norm_stderr": 0.02435280072297001
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9054726368159204,
"acc_stderr": 0.0206871869515341,
"acc_norm": 0.9054726368159204,
"acc_norm_stderr": 0.0206871869515341
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.5399803687437482,
"mc2_stderr": 0.014956918567738575
},
"harness|winogrande|5": {
"acc": 0.8279400157853196,
"acc_stderr": 0.010607731615247008
},
"harness|gsm8k|5": {
"acc": 0.6178923426838514,
"acc_stderr": 0.013384173935648492
}
}
```
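The results JSON above is a flat mapping from task name to a dict of metrics, with the precomputed aggregate under the `"all"` key, so it can be summarized with the standard library alone. A minimal sketch over a small excerpt of the structure shown above (field names taken from that snippet):

```python
import json

# A small excerpt of the results structure shown above.
results = json.loads("""
{
  "all": {"acc": 0.7557329155625617, "acc_norm": 0.7606955903500686},
  "harness|hendrycksTest-virology|5": {"acc": 0.572289156626506},
  "harness|hendrycksTest-marketing|5": {"acc": 0.9358974358974359}
}
""")

# Aggregated accuracy is precomputed under the "all" key.
print(f"overall acc: {results['all']['acc']:.4f}")

# List the individual tasks where accuracy exceeds 0.9.
strong = [task for task, metrics in results.items()
          if task != "all" and metrics.get("acc", 0) > 0.9]
print(strong)  # ['harness|hendrycksTest-marketing|5']
```

The same loop works on the full results file downloaded from the link above.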
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jzhuolin/112211 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_mlabonne__NeuralDaredevil-7B | ---
pretty_name: Evaluation run of mlabonne/NeuralDaredevil-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mlabonne/NeuralDaredevil-7B](https://huggingface.co/mlabonne/NeuralDaredevil-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__NeuralDaredevil-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T01:21:38.357937](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralDaredevil-7B/blob/main/results_2024-01-16T01-21-38.357937.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6564539739507499,\n\
\ \"acc_stderr\": 0.032046532668970576,\n \"acc_norm\": 0.6558352422789521,\n\
\ \"acc_norm_stderr\": 0.03271651117623881,\n \"mc1\": 0.5152998776009792,\n\
\ \"mc1_stderr\": 0.0174953044731879,\n \"mc2\": 0.6685490621837052,\n\
\ \"mc2_stderr\": 0.014954458772938018\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6749146757679181,\n \"acc_stderr\": 0.013688147309729124,\n\
\ \"acc_norm\": 0.6988054607508533,\n \"acc_norm_stderr\": 0.013406741767847638\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6970722963553077,\n\
\ \"acc_stderr\": 0.004585850835623566,\n \"acc_norm\": 0.8762198765186218,\n\
\ \"acc_norm_stderr\": 0.003286574812451194\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7433962264150943,\n \"acc_stderr\": 0.026880647889051975,\n\
\ \"acc_norm\": 0.7433962264150943,\n \"acc_norm_stderr\": 0.026880647889051975\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657262,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657262\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700486,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700486\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250454,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250454\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.03957835471980979,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.03957835471980979\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608303,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608303\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42905027932960893,\n\
\ \"acc_stderr\": 0.016553287863116033,\n \"acc_norm\": 0.42905027932960893,\n\
\ \"acc_norm_stderr\": 0.016553287863116033\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\
\ \"acc_stderr\": 0.012734923579532069,\n \"acc_norm\": 0.46284224250325945,\n\
\ \"acc_norm_stderr\": 0.012734923579532069\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5152998776009792,\n\
\ \"mc1_stderr\": 0.0174953044731879,\n \"mc2\": 0.6685490621837052,\n\
\ \"mc2_stderr\": 0.014954458772938018\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047986\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.731614859742229,\n \
\ \"acc_stderr\": 0.012205702688013673\n }\n}\n```"
repo_url: https://huggingface.co/mlabonne/NeuralDaredevil-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|arc:challenge|25_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|gsm8k|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hellaswag|10_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T01-21-38.357937.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T01-21-38.357937.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- '**/details_harness|winogrande|5_2024-01-16T01-21-38.357937.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T01-21-38.357937.parquet'
- config_name: results
data_files:
- split: 2024_01_16T01_21_38.357937
path:
- results_2024-01-16T01-21-38.357937.parquet
- split: latest
path:
- results_2024-01-16T01-21-38.357937.parquet
---
# Dataset Card for Evaluation run of mlabonne/NeuralDaredevil-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/NeuralDaredevil-7B](https://huggingface.co/mlabonne/NeuralDaredevil-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run appears as a separate split in each configuration, named with the timestamp of the run; the "latest" split always points to the most recent results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__NeuralDaredevil-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-16T01:21:38.357937](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralDaredevil-7B/blob/main/results_2024-01-16T01-21-38.357937.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6564539739507499,
"acc_stderr": 0.032046532668970576,
"acc_norm": 0.6558352422789521,
"acc_norm_stderr": 0.03271651117623881,
"mc1": 0.5152998776009792,
"mc1_stderr": 0.0174953044731879,
"mc2": 0.6685490621837052,
"mc2_stderr": 0.014954458772938018
},
"harness|arc:challenge|25": {
"acc": 0.6749146757679181,
"acc_stderr": 0.013688147309729124,
"acc_norm": 0.6988054607508533,
"acc_norm_stderr": 0.013406741767847638
},
"harness|hellaswag|10": {
"acc": 0.6970722963553077,
"acc_stderr": 0.004585850835623566,
"acc_norm": 0.8762198765186218,
"acc_norm_stderr": 0.003286574812451194
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7433962264150943,
"acc_stderr": 0.026880647889051975,
"acc_norm": 0.7433962264150943,
"acc_norm_stderr": 0.026880647889051975
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700486,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700486
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250454,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250454
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980979,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980979
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608303,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608303
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42905027932960893,
"acc_stderr": 0.016553287863116033,
"acc_norm": 0.42905027932960893,
"acc_norm_stderr": 0.016553287863116033
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712992,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712992
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532069,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532069
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5152998776009792,
"mc1_stderr": 0.0174953044731879,
"mc2": 0.6685490621837052,
"mc2_stderr": 0.014954458772938018
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047986
},
"harness|gsm8k|5": {
"acc": 0.731614859742229,
"acc_stderr": 0.012205702688013673
}
}
```
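For programmatic use, the nested results dict above can be flattened and averaged. The sketch below is illustrative only: the task subset is truncated from the full table, and treating the aggregate `acc` as an unweighted mean of per-task accuracies is an assumption, not something stated in this card.

```python
# Sketch: flatten the nested results dict shown above and average the
# per-task accuracies. Task subset truncated; the unweighted-mean
# interpretation of the aggregate "acc" is an assumption.
from statistics import mean

results = {
    "harness|arc:challenge|25": {"acc": 0.6749146757679181},
    "harness|hellaswag|10": {"acc": 0.6970722963553077},
    "harness|winogrande|5": {"acc": 0.8208366219415943},
}

# Keep only tasks that report an "acc" score, then average them.
per_task_acc = {task: scores["acc"] for task, scores in results.items() if "acc" in scores}
average_acc = mean(per_task_acc.values())
print(f"{len(per_task_acc)} tasks, mean acc = {average_acc:.4f}")
```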
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
joelniklaus/german_argument_mining | ---
annotations_creators:
- expert-generated
- found
language_creators:
- found
language:
- de
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Annotated German Legal Decision Corpus
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
---
# Dataset Card for Annotated German Legal Decision Corpus
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** https://zenodo.org/record/3936490#.X1ed7ovgomK
- **Paper:** Urchs., S., Mitrović., J., & Granitzer., M. (2021). Design and Implementation of German Legal Decision
Corpora. Proceedings of the 13th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART,
515–521. https://doi.org/10.5220/0010187305150521
- **Leaderboard:**
- **Point of Contact:** [Joel Niklaus](mailto:joel.niklaus.2@bfh.ch)
### Dataset Summary
This dataset consists of 200 randomly chosen judgments. In these judgments a legal expert annotated the components
*conclusion*, *definition* and *subsumption* of the German legal writing style *Urteilsstil*.
*"Overall 25,075 sentences are annotated. 5% (1,202) of these sentences are marked as conclusion, 21% (5,328) as
definition, 53% (13,322) are marked as subsumption and the remaining 21% (6,481) as other. The length of judgments in
sentences ranges from 38 to 862 sentences. The median of judgments have 97 sentences, the length of most judgments is on
the shorter side."* (Urchs. et al., 2021)
*"Judgments from 22 of the 131 courts are selected for the corpus. Most judgments originate from the VG Augsburg (59 /
30%) followed by the VG Ansbach (39 / 20%) and LSG Munich (33 / 17%)."* (Urchs. et al., 2021)
*"29% (58) of all selected judgments are issued in the year 2016, followed by 22% (44) from the year 2017 and 21% (41)
issued in the year 2015. [...] The percentages of selected judgments and decisions issued in 2018 and 2019 are roughly
the same. No judgments from 2020 are selected."* (Urchs. et al., 2021)
### Supported Tasks and Leaderboards
The dataset can be used for multi-class text classification tasks, more specifically, for argument mining.
### Languages
The language in the dataset is German as it is used in Bavarian courts in Germany.
## Dataset Structure
### Data Instances
Each sentence is saved as a json object on a line in one of the three files `train.jsonl`, `validation.jsonl`
or `test.jsonl`. The file `meta.jsonl` contains meta information for each court. The `file_number` is present in all
files for identification. Each sentence of the court decision was categorized according to its function.
### Data Fields
The file `meta.jsonl` contains for each row the following fields:
- `meta_title`: Title provided by the website, it is used for saving the decision
- `court`: Issuing court
- `decision_style`: Style of the decision; the corpus contains either *Urteil* (='judgment') or *Endurteil* (
='end-judgment')
- `date`: Date when the decision was issued by the court
- `file_number`: Identification number used for this decision by the court
- `title`: Title provided by the court
- `norm_chains`: Norms related to the decision
- `decision_guidelines`: Short summary of the decision
- `keywords`: Keywords associated with the decision
- `lower_court`: Court that decided on the decision before
- `additional_information`: Additional Information
- `decision_reference`: References to the location of the decision in beck-online
- `tenor`: Designation of the legal consequence ordered by the court (list of paragraphs)
- `legal_facts`: Facts that form the base for the decision (list of paragraphs)
The files `train.jsonl`, `validation.jsonl` and `test.jsonl` contain the following fields:
- `file_number`: Identification number for linkage with the file `meta.jsonl`
- `input_sentence`: The sentence to be classified
- `label`: In depth explanation of the court decision. Each sentence is assigned to one of the major components of
German *Urteilsstil* (Urchs. et al., 2021) (list of paragraphs, each paragraph containing list of sentences, each
sentence annotated with one of the following four labels):
- `conclusion`: Overall result
- `definition`: Abstract legal facts and consequences
- `subsumption`: Determination sentence / Concrete facts
- `other`: Anything else
- `context_before`: Context in the same paragraph before the input_sentence
- `context_after`: Context in the same paragraph after the input_sentence
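A minimal sketch of consuming one of these files line by line follows; the sample record is invented for illustration, and only the field names come from the list above.

```python
# Minimal sketch: each line of train.jsonl is one JSON object with the fields
# listed above. The sample record below is invented for illustration only.
import json
from io import StringIO

sample = StringIO(
    '{"file_number": "M 1 K 16.1", "input_sentence": "Die Klage ist unbegruendet.", '
    '"label": "conclusion", "context_before": "", "context_after": ""}\n'
)

for line in sample:
    record = json.loads(line)
    print(record["file_number"], "->", record["label"])
```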
### Data Splits
No splits were provided in the original release.
Splits were created by Joel Niklaus. We randomly split the dataset into 80% train (160 decisions, 19271 sentences), 10%
validation (20 decisions, 2726 sentences) and 10% test (20 decisions, 3078 sentences). We made sure that a decision
only occurs in one split and is not dispersed over multiple splits.
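The decision-level split described above can be sketched roughly as follows. This is an illustrative reimplementation, not the curators' actual code; see ```convert_to_hf_dataset.py``` for the real procedure.

```python
# Illustrative sketch of an 80/10/10 split at the decision level, so that all
# sentences sharing a file_number land in the same split. Not the curators'
# actual code; see convert_to_hf_dataset.py for the real procedure.
import random

def split_by_decision(sentences, seed=42):
    """sentences: list of dicts, each carrying a 'file_number' key."""
    decisions = sorted({s["file_number"] for s in sentences})
    rng = random.Random(seed)
    rng.shuffle(decisions)
    n = len(decisions)
    train_ids = set(decisions[: int(0.8 * n)])
    val_ids = set(decisions[int(0.8 * n) : int(0.9 * n)])
    splits = {"train": [], "validation": [], "test": []}
    for s in sentences:
        if s["file_number"] in train_ids:
            splits["train"].append(s)
        elif s["file_number"] in val_ids:
            splits["validation"].append(s)
        else:
            splits["test"].append(s)
    return splits

# Toy usage: 10 decisions with 3 sentences each -> 8/1/1 decisions.
toy = [{"file_number": f"D{i}", "input_sentence": f"s{j}"} for i in range(10) for j in range(3)]
splits = split_by_decision(toy)
# No decision may appear in more than one split.
assert all(
    {s["file_number"] for s in splits[a]}.isdisjoint({s["file_number"] for s in splits[b]})
    for a, b in [("train", "validation"), ("train", "test"), ("validation", "test")]
)
```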
Label Distribution
| label | train | validation | test |
|:---------------|-----------:|-------------:|----------:|
| conclusion | 975 | 115 | 112 |
| definition | 4105 | 614 | 609 |
| subsumption | 10034 | 1486 | 1802 |
| other | 4157 | 511 | 555 |
| total | **19271** | **2726** | **3078** |
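As a quick sanity check (illustrative; not part of the original card), the table's column totals can be verified programmatically:

```python
# Sanity-check the label distribution table: column sums must match the
# per-split totals given in the last row.
distribution = {
    "conclusion":  {"train": 975,   "validation": 115,  "test": 112},
    "definition":  {"train": 4105,  "validation": 614,  "test": 609},
    "subsumption": {"train": 10034, "validation": 1486, "test": 1802},
    "other":       {"train": 4157,  "validation": 511,  "test": 555},
}
totals = {split: sum(row[split] for row in distribution.values())
          for split in ("train", "validation", "test")}
print(totals)  # expected: {'train': 19271, 'validation': 2726, 'test': 3078}
```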
## Dataset Creation
### Curation Rationale
Creating a publicly available German legal text corpus consisting of judgments that have been annotated by a legal
expert. The annotated components consist of *conclusion*, *definition* and *subsumption* of the German legal writing
style *Urteilsstil*.
### Source Data
#### Initial Data Collection and Normalization
*“The decision corpus is a collection of the decisions published on the website www.gesetze-bayern.de. At the time of
the crawling the website offered 32,748 decisions of 131 Bavarian courts, dating back to 2015. The decisions are
provided from the Bavarian state after the courts agreed to a publication. All decisions are processed by the publisher
C.H.BECK, commissioned by the Bavarian state. This processing includes anonymisation, key-wording, and adding of
editorial guidelines to the decisions.”* (Urchs. et al., 2021)
#### Who are the source language producers?
German courts from Bavaria
### Annotations
#### Annotation process
*“As stated above, the judgment corpus consist of 200 randomly chosen judgments that are annotated by a legal expert,
who holds a first legal state exam. Due to financial, staff and time reasons the presented iteration of the corpus was
only annotated by a single expert. In a future version several other experts will annotate the corpus and the
inter-annotator agreement will be calculated.”* (Urchs. et al., 2021)
#### Who are the annotators?
A legal expert, who holds a first legal state exam.
### Personal and Sensitive Information
*“All decisions are processed by the publisher C.H.BECK, commissioned by the Bavarian state. This processing includes **anonymisation**, key-wording, and adding of editorial guidelines to the decisions.”* (Urchs. et al., 2021)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
The SoMaJo sentence splitter has been used. Upon manual inspection of the dataset, we could see that the sentence
splitter had poor accuracy in some cases (see ```analyze_dataset()``` in ```convert_to_hf_dataset.py```). When creating
the splits, we thought about merging small sentences with their neighbors or removing them altogether. However, since
we could not find a straightforward way to do this, we decided to leave the dataset content untouched.
Note that the information given in this dataset card refers to the dataset version as provided by Joel Niklaus and Veton
Matoshi. The dataset at hand is intended to be part of a bigger benchmark dataset. Creating a benchmark dataset
consisting of several other datasets from different sources requires postprocessing. Therefore, the structure of the
dataset at hand, including the folder structure, may differ considerably from the original dataset. In addition to that,
differences with regard to dataset statistics as given in the respective papers can be expected. The reader is advised to
have a look at the conversion script ```convert_to_hf_dataset.py``` in order to retrace the steps for converting the
original dataset into the present jsonl format. For further information on the original dataset structure, we refer to
the bibliographical references and the original GitHub repositories and/or web pages provided in this dataset card.
## Additional Information
### Dataset Curators
The names of the original dataset curators and creators can be found in references given below, in the section *Citation
Information*. Additional changes were made by Joel Niklaus ([Email](mailto:joel.niklaus.2@bfh.ch)
; [Github](https://github.com/joelniklaus)) and Veton Matoshi ([Email](mailto:veton.matoshi@bfh.ch)
; [Github](https://github.com/kapllan)).
### Licensing Information
[Creative Commons Attribution 4.0 International](https://creativecommons.org/licenses/by/4.0/legalcode)
### Citation Information
```
@dataset{urchs_stefanie_2020_3936490,
author = {Urchs, Stefanie and
Mitrović, Jelena},
title = {{German legal jugements annotated with judement
style components}},
month = jul,
year = 2020,
publisher = {Zenodo},
doi = {10.5281/zenodo.3936490},
url = {https://doi.org/10.5281/zenodo.3936490}
}
```
```
@conference{icaart21,
author = {Urchs., Stefanie and Mitrovi{\'{c}}., Jelena and Granitzer., Michael},
booktitle = {Proceedings of the 13th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART,},
doi = {10.5220/0010187305150521},
isbn = {978-989-758-484-8},
issn = {2184-433X},
organization = {INSTICC},
pages = {515--521},
publisher = {SciTePress},
title = {{Design and Implementation of German Legal Decision Corpora}},
year = {2021}
}
```
### Contributions
Thanks to [@kapllan](https://github.com/kapllan) and [@joelniklaus](https://github.com/joelniklaus) for adding this
dataset.
|
mari-lab/mari-monolingual-corpus | ---
language:
- mhr
---
A monolingual corpus of the Mari language in various genres, containing over 20 million word occurrences.
The genres represented:
|Genre|Russian|English|
|:-----|:-------|:-------|
|мутер|словарь|dictionary|
|газетысе увер|газетные новости|periodical news|
|прозо|проза|prose|
|фольклор|фольклор|folklore|
|публицистике|публицистика|publicistic literature|
|поэзий|поэзия|poetry|
|трагикомедий|трагикомедия|tragicomedy|
|пьесе|пьеса|play|
|драме|драма|drama|
|комедий-водевиль|водевиль|vaudeville|
|комедий|комедия|comedy|
|йоча ойлымаш-влак|детские рассказы|children's stories|
|сценарий|сценарий|scenario|
|йомак|сказка|fairytale|
|Библий|Библия|Bible|
|повесть|повесть|novel|
|ойлымаш-влак|рассказы|stories|
|ойлымаш ден йомак-влак|рассказы и сказки|stories and fairytales|
|автор нерген|об авторе|about the author|
```
@inproceedings{chemyshev2023mari,
  title={Mari monolingual corpus},
  author={Andrei Chemyshev and Gennadii Sabantsev and Nadezhda Timofeeva and Vasilii Semenov},
  year={2023}
}
``` |
abdur75648/UTRSet-Real | ---
title: UrduSet-Real (UTRNet)
emoji: 📖
colorFrom: red
colorTo: green
license: cc-by-nc-4.0
task_categories:
- image-to-text
language:
- ur
tags:
- ocr
- text recognition
- urdu-ocr
- utrnet
pretty_name: UTRSet-Real
references:
- https://github.com/abdur75648/UTRNet-High-Resolution-Urdu-Text-Recognition
- https://abdur75648.github.io/UTRNet/
- https://arxiv.org/abs/2306.15782
---
The **UTRSet-Real** dataset is a comprehensive, manually annotated dataset specifically curated for **Printed Urdu OCR** research. It contains over **11,000** printed text line images, each of which has been meticulously annotated. One of the standout features of this dataset is its remarkable diversity, which includes variations in fonts, text sizes, colours, orientations, lighting conditions, noise, styles, and backgrounds. This diversity closely mirrors real-world scenarios, making the dataset highly suitable for training and evaluating models that aim to excel in real-world Urdu text recognition tasks.
The availability of the UTRSet-Real dataset addresses the scarcity of comprehensive real-world printed Urdu OCR datasets. By providing researchers with a valuable resource for developing and benchmarking Urdu OCR models, this dataset promotes standardized evaluation and reproducibility and fosters advancements in the field of Urdu OCR. Further, to complement the UTRSet-Real for training purposes, we also present [**UTRSet-Synth**](https://paperswithcode.com/dataset/utrset-synth), a high-quality synthetic dataset closely resembling real-world representations of Urdu text. For more information and details about the [UTRSet-Real](https://paperswithcode.com/dataset/utrset-real) & [UTRSet-Synth](https://paperswithcode.com/dataset/utrset-synth) datasets, please refer to the paper ["UTRNet: High-Resolution Urdu Text Recognition In Printed Documents"](https://arxiv.org/abs/2306.15782) |
tanvirsrbd1/vary_merged_dataset1 | ---
dataset_info:
features:
- name: html
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 3337766
num_examples: 5960
download_size: 1093625
dataset_size: 3337766
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "vary_merged_dataset1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-108000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1004845
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
iamnguyen/edu_child_01 | ---
dataset_info:
features:
- name: content
dtype: string
- name: metadata
struct:
- name: metadata
struct:
- name: answer
dtype: string
- name: id
dtype: string
- name: prefix
dtype: string
- name: question
dtype: string
- name: school_id
dtype: string
- name: seq_num
dtype: int64
- name: source
dtype: string
- name: tokenized_question
dtype: string
- name: url
dtype: string
- name: vector
sequence: float64
- name: vector
sequence: float64
splits:
- name: train
num_bytes: 18574718
num_examples: 1015
download_size: 12148966
dataset_size: 18574718
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nateraw/misc | ---
license: mit
---
|
rkf2778/amazon_reviews_mobile_electronics | ---
license: mit
task_categories:
- text-classification
language:
- en
tags:
- reviews
size_categories:
- 10K<n<100K
--- |
heliosprime/twitter_dataset_1713016355 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 13048
num_examples: 29
download_size: 10112
dataset_size: 13048
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713016355"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-college_chemistry-neg-prepend-verbal | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
- name: neg_prompt
dtype: string
splits:
- name: dev
num_bytes: 7604
num_examples: 5
- name: test
num_bytes: 809370
num_examples: 100
download_size: 138526
dataset_size: 816974
---
# Dataset Card for "mmlu-college_chemistry-neg-prepend-verbal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Joe02/kuroishimebo_refs | ---
license: other
---
|
JuanKO/custom_tokenizer_littlestories | ---
license: openrail
dataset_info:
features:
- name: token
dtype: string
- name: embedding
sequence: float64
splits:
- name: train
num_bytes: 28247902
num_examples: 3443
download_size: 22867616
dataset_size: 28247902
---
|
liuyanchen1015/MULTI_VALUE_rte_existential_got | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 45160
num_examples: 112
- name: train
num_bytes: 53000
num_examples: 115
download_size: 72445
dataset_size: 98160
---
# Dataset Card for "MULTI_VALUE_rte_existential_got"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yashnbx/indic-gretil-dump | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: title
dtype: string
- name: level
dtype: string
- name: url
dtype: string
- name: f_level
dtype: string
- name: name
dtype: string
- name: people
dtype: string
- name: gpt-descriptions
dtype: string
- name: page_size
dtype: float64
- name: page_content_type
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 618940758
num_examples: 1035
download_size: 254899614
dataset_size: 618940758
---
# Dataset Card for "indic-gretil-dump"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NLPC-UOM/Sinhala-Neuspellcorrector | ---
language:
- si
license:
- mit
---
This repository contains the dataset for the paper "A Neural Spell Corrector and a Baseline for Sinhala Spell Correction". |
natmin322/18k_vietnamese_voice_augmented_of_VigBigData | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 2227145408.0
num_examples: 10000
- name: validation
num_bytes: 1416405046.0
num_examples: 5000
- name: test
num_bytes: 886300388.18
num_examples: 3005
download_size: 4798437831
dataset_size: 4529850842.18
---
# Dataset Card for "18k_vietnamese_voice_augmented_of_VigBigData"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/takarada_rikka | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Takarada Rikka/宝多六花
This is the dataset of Takarada Rikka/宝多六花, containing 500 images and their tags.
The core tags of this character are `black_hair, long_hair, blue_eyes, bangs, bow, red_bow, scrunchie, orange_scrunchie, wrist_scrunchie, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 590.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/takarada_rikka/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 337.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/takarada_rikka/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1231 | 731.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/takarada_rikka/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 527.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/takarada_rikka/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1231 | 1011.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/takarada_rikka/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/takarada_rikka',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, black_skirt, long_sleeves, looking_at_viewer, pleated_skirt, school_uniform, solo, white_cardigan, white_shirt, blush, red_bowtie, closed_mouth, collared_shirt, white_background, thighs, simple_background, sitting, miniskirt |
| 1 | 11 |  |  |  |  |  | 1girl, school_uniform, solo, upper_body, closed_mouth, looking_at_viewer, simple_background, red_bowtie, white_background, white_cardigan, white_shirt, collared_shirt, long_sleeves |
| 2 | 5 |  |  |  |  |  | 1girl, black_skirt, blush, long_sleeves, looking_at_viewer, parted_lips, pleated_skirt, red_bowtie, school_uniform, solo, thighs, white_cardigan, white_shirt, collared_shirt, earphones, holding, miniskirt, medium_breasts, on_back, bed_sheet, on_side |
| 3 | 8 |  |  |  |  |  | 1girl, black_skirt, from_behind, long_sleeves, looking_at_viewer, looking_back, pleated_skirt, school_uniform, solo, thighs, ass, blush, simple_background, white_background, panties, white_cardigan, microskirt, miniskirt |
| 4 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, navel, solo, thighs, thigh_strap, off_shoulder, bare_shoulders, blush, medium_breasts, ponytail, black_gloves, cleavage, midriff, simple_background, white_background, miniskirt, open_jacket, stomach, black_skirt, blue_jacket, crop_top, long_sleeves, short_shorts |
| 5 | 12 |  |  |  |  |  | 1girl, black_bikini, cleavage, layered_bikini, looking_at_viewer, side-tie_bikini_bottom, solo, hair_scrunchie, medium_breasts, ponytail, tiger_print, navel, collarbone, cowboy_shot, simple_background, white_background, blush, mismatched_bikini, standing |
| 6 | 6 |  |  |  |  |  | 1girl, christmas, santa_costume, santa_hat, solo, fur_trim, looking_at_viewer, red_dress, red_headwear, bare_shoulders, holding, smile, belt, blush |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_skirt | long_sleeves | looking_at_viewer | pleated_skirt | school_uniform | solo | white_cardigan | white_shirt | blush | red_bowtie | closed_mouth | collared_shirt | white_background | thighs | simple_background | sitting | miniskirt | upper_body | parted_lips | earphones | holding | medium_breasts | on_back | bed_sheet | on_side | from_behind | looking_back | ass | panties | microskirt | navel | thigh_strap | off_shoulder | bare_shoulders | ponytail | black_gloves | cleavage | midriff | open_jacket | stomach | blue_jacket | crop_top | short_shorts | black_bikini | layered_bikini | side-tie_bikini_bottom | hair_scrunchie | tiger_print | collarbone | cowboy_shot | mismatched_bikini | standing | christmas | santa_costume | santa_hat | fur_trim | red_dress | red_headwear | smile | belt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:---------------|:--------------------|:----------------|:-----------------|:-------|:-----------------|:--------------|:--------|:-------------|:---------------|:-----------------|:-------------------|:---------|:--------------------|:----------|:------------|:-------------|:--------------|:------------|:----------|:-----------------|:----------|:------------|:----------|:--------------|:---------------|:------|:----------|:-------------|:--------|:--------------|:---------------|:-----------------|:-----------|:---------------|:-----------|:----------|:--------------|:----------|:--------------|:-----------|:---------------|:---------------|:-----------------|:-------------------------|:-----------------|:--------------|:-------------|:--------------|:--------------------|:-----------|:------------|:----------------|:------------|:-----------|:------------|:---------------|:--------|:-------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | | X | X | | X | X | X | X | | X | X | X | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | | X | | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | | | | X | X | X | | X | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | X | X | X | | | X | | | X | | | | X | X | X | | X | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 5 | 12 |  |  |  |  |  | X | | | X | | | X | | | X | | | | X | | X | | | | | | | X | | | | | | | | | X | | | | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | X | | | X | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
issai/kazparc | ---
task_categories:
- translation
language:
- kk
- en
- ru
- tr
pretty_name: Kazakh Parallel Corpus
dataset_info:
- config_name: kazparc_raw
features:
- name: id
dtype: string
- name: kk
dtype: string
- name: en
dtype: string
- name: ru
dtype: string
- name: tr
dtype: string
- name: domain
dtype: string
splits:
- name: train
num_bytes: 230957871
num_examples: 371902
- config_name: kazparc
features:
- name: id
dtype: string
- name: source_lang
dtype: string
- name: target_lang
dtype: string
- name: domain
dtype: string
- name: pair
dtype: string
splits:
- name: train
num_bytes: 584249013
num_examples: 1742956
- name: validation
num_bytes: 145898177
num_examples: 435742
- name: test
num_bytes: 8936796
num_examples: 28500
- config_name: sync_raw
features:
- name: id
dtype: string
- name: kk
dtype: string
- name: en
dtype: string
- name: ru
dtype: string
- name: tr
dtype: string
- name: domain
dtype: string
splits:
- name: train
num_bytes: 1278185141
num_examples: 1797066
- config_name: sync
features:
- name: id
dtype: string
- name: source_lang
dtype: string
- name: target_lang
dtype: string
- name: domain
dtype: string
- name: pair
dtype: string
splits:
- name: train
num_bytes: 3654616080
num_examples: 9654322
- name: validation
num_bytes: 405929897
num_examples: 1072705
configs:
- config_name: kazparc_raw
data_files:
- split: train
path: kazparc/01_kazparc_all_entries.csv
default: true
- config_name: kazparc
data_files:
- split: train
path:
- kazparc/02_kazparc_train_kk_en.csv
- kazparc/03_kazparc_train_kk_ru.csv
- kazparc/04_kazparc_train_kk_tr.csv
- kazparc/05_kazparc_train_en_ru.csv
- kazparc/06_kazparc_train_en_tr.csv
- kazparc/07_kazparc_train_ru_tr.csv
- split: validation
path:
- kazparc/08_kazparc_valid_kk_en.csv
- kazparc/09_kazparc_valid_kk_ru.csv
- kazparc/10_kazparc_valid_kk_tr.csv
- kazparc/11_kazparc_valid_en_ru.csv
- kazparc/12_kazparc_valid_en_tr.csv
- kazparc/13_kazparc_valid_ru_tr.csv
- split: test
path:
- kazparc/14_kazparc_test_kk_en.csv
- kazparc/15_kazparc_test_kk_ru.csv
- kazparc/16_kazparc_test_kk_tr.csv
- kazparc/17_kazparc_test_en_ru.csv
- kazparc/18_kazparc_test_en_tr.csv
- kazparc/19_kazparc_test_ru_tr.csv
- config_name: sync_raw
data_files:
- split: train
path: sync/20_sync_all_entries.csv
- config_name: sync
data_files:
- split: train
path:
- sync/21_sync_train_kk_en.csv
- sync/22_sync_train_kk_ru.csv
- sync/23_sync_train_kk_tr.csv
- sync/24_sync_train_en_ru.csv
- sync/25_sync_train_en_tr.csv
- sync/26_sync_train_ru_tr.csv
- split: validation
path:
- sync/27_sync_valid_kk_en.csv
- sync/28_sync_valid_kk_ru.csv
- sync/29_sync_valid_kk_tr.csv
- sync/30_sync_valid_en_ru.csv
- sync/31_sync_valid_en_tr.csv
- sync/32_sync_valid_ru_tr.csv
size_categories:
- 100K<n<1M
---
## Dataset Description
- **Repository:** https://github.com/IS2AI/KazParC
- **Paper:** https://arxiv.org/abs/2403.19399
<h1 align = "center">KazParC</h1>
<p align = "justify">Kazakh Parallel Corpus (KazParC) is a parallel corpus designed for machine translation across Kazakh, English, Russian, and Turkish. The first and largest publicly available corpus of its kind, KazParC contains a collection of 372,164 parallel sentences covering different domains and developed with the assistance of human translators.
</p>
<a style="text-decoration:none" name = "sources_domains"><h2 align = "center">Data Sources and Domains</h2></a>
<p align = "justify">The data sources include</p>
<ul>
<li>proverbs and sayings</li>
<li>terminology glossaries</li>
<li>phrasebooks</li>
<li>literary works</li>
<li>periodicals</li>
<li>language learning materials, including the SCoRE corpus by <a href = "https://www.torrossa.com/en/resources/an/5000845#page=118">Chujo et al. (2015)</a></li>
<li>educational video subtitle collections, such as QED by <a href = "http://www.lrec-conf.org/proceedings/lrec2014/pdf/877_Paper.pdf">Abdelali et al. (2014)</a></li>
<li>news items, such as KazNERD (<a href = "https://aclanthology.org/2022.lrec-1.44.pdf">Yeshpanov et al., 2022</a>) and WMT (<a href = "http://www.lrec-conf.org/proceedings/lrec2012/pdf/463_Paper.pdf">Tiedemann, 2012</a>)</li>
<li><a href = "https://www.ted.com/">TED</a> talks</li>
<li><a href = "https://adilet.zan.kz/">governmental and regulatory legal documents from Kazakhstan</a></li>
<li>communications from the <a href = "https://www.akorda.kz/">official website of the President of the Republic of Kazakhstan</a></li>
<li><a href = "https://www.un.org/">United Nations</a> publications</li>
<li>image captions from sources like <a href = "https://arxiv.org/pdf/1405.0312.pdf%090.949.pdf">COCO</a></li>
</ul>
<p align = "justify">The sources are categorised into five broad domains:</p>
<table align = "center">
<thead>
<tr align = "center">
<th rowspan="3">Domain</th>
<th align = "right" colspan="2" rowspan="2">lines</th>
<th colspan="8">tokens</th>
</tr>
<tr align = "right">
<th colspan="2">EN</th>
<th colspan="2">KK</th>
<th colspan="2">RU</th>
<th colspan="2">TR</th>
</tr>
<tr align = "right">
<th>#</th>
<th>%</th>
<th>#</th>
<th>%</th>
<th>#</th>
<th>%</th>
<th>#</th>
<th>%</th>
<th>#</th>
<th>%</th>
</tr>
</thead>
<tbody align = "right">
<tr>
<td align = "center">Mass media</td>
<td>120,547</td>
<td>32.4</td>
<td>1,817,276</td>
<td>28.3</td>
<td>1,340,346</td>
<td>28.6</td>
<td>1,454,430</td>
<td>29.0</td>
<td>1,311,985</td>
<td>28.5</td>
</tr>
<tr>
<td align = "center">General</td>
<td>94,988</td>
<td>25.5</td>
<td>844,541</td>
<td>13.1</td>
<td>578,236</td>
<td>12.3</td>
<td>618,960</td>
<td>12.3</td>
<td>608,020</td>
<td>13.2</td>
</tr>
<tr>
<td align = "center">Legal documents</td>
<td>77,183</td>
<td>20.8</td>
<td>2,650,626</td>
<td>41.3</td>
<td>1,925,561</td>
<td>41.0</td>
<td>1,991,222</td>
<td>39.7</td>
<td>1,880,081</td>
<td>40.8</td>
</tr>
<tr>
<td align = "center">Education and science</td>
<td>46,252</td>
<td>12.4</td>
<td>522,830</td>
<td>8.1</td>
<td>392,348</td>
<td>8.4</td>
<td>444,786</td>
<td>8.9</td>
<td>376,484</td>
<td>8.2</td>
</tr>
<tr>
<td align = "center">Fiction</td>
<td>32,932</td>
<td>8.9</td>
<td>589,001</td>
<td>9.2</td>
<td>456,385</td>
<td>9.7</td>
<td>510,168</td>
<td>10.2</td>
<td>433,968</td>
<td>9.4</td>
</tr>
<tr>
<td align = "center"><b>Total</b></td>
<td><b>371,902</b></td>
<td><b>100</b></td>
<td><b>6,424,274</b></td>
<td><b>100</b></td>
<td><b>4,692,876</b></td>
<td><b>100</b></td>
<td><b>5,019,566</b></td>
<td><b>100</b></td>
<td><b>4,610,538</b></td>
<td><b>100</b></td>
</tr>
</tbody>
</table>
<table align = "center">
<thead align = "center">
<tr>
<th>Pair</th>
<th># lines</th>
<th># sents</th>
<th># tokens</th>
<th># types</th>
</tr>
</thead>
<tbody align = "center">
<tr>
<td>KK↔EN</td>
<td>363,594</td>
<td>362,230<br>361,087</td>
<td>4,670,789<br>6,393,381</td>
<td>184,258<br>59,062</td>
</tr>
<tr>
<td>KK↔RU</td>
<td>363,482</td>
<td>362,230<br>362,748</td>
<td>4,670,593<br>4,996,031</td>
<td>184,258<br>183,204</td>
</tr>
<tr>
<td>KK↔TR</td>
<td>362,150</td>
<td>362,230<br>361,660</td>
<td>4,668,852<br>4,586,421</td>
<td>184,258<br>175,145</td>
</tr>
<tr>
<td>EN↔RU</td>
<td>363,456</td>
<td>361,087<br>362,748</td>
<td>6,392,301<br>4,994,310</td>
<td>59,062<br>183,204</td>
</tr>
<tr>
<td>EN↔TR</td>
<td>362,392</td>
<td>361,087<br>361,660</td>
<td>6,380,703<br>4,579,375</td>
<td>59,062<br>175,145</td>
</tr>
<tr>
<td>RU↔TR</td>
<td>363,324</td>
<td>362,748<br>361,660</td>
<td>4,999,850<br>4,591,847</td>
<td>183,204<br>175,145</td>
</tr>
</tbody>
</table>
<h2 align = "center">Synthetic Corpus</h2>
<p align = "justify">To make our parallel corpus more extensive, we carried out web crawling to gather a total of 1,797,066 sentences from English-language websites. These sentences were then automatically translated into Kazakh, Russian, and Turkish using the <a href = "https://translate.google.com/">Google Translate</a> service. We refer to this collection of data as 'SynC' (Synthetic Corpus).</p>
<table align = "center">
<thead align = "center">
<tr>
<th>Pair</th>
<th># lines</th>
<th># sents</th>
<th># tokens</th>
<th># types</th>
</tr>
</thead>
<tbody align = "center">
<tr>
<td>KK↔EN</td>
<td>1,787,050</td>
<td>1,782,192<br>1,781,019</td>
<td>26,630,960<br>35,291,705</td>
<td>685,135<br>300,556</td>
</tr>
<tr>
<td>KK↔RU</td>
<td>1,787,448</td>
<td>1,782,192<br>1,777,500</td>
<td>26,654,195<br>30,241,895</td>
<td>685,135<br>672,146</td>
</tr>
<tr>
<td>KK↔TR</td>
<td>1,791,425</td>
<td>1,782,192<br>1,782,257</td>
<td>26,726,439<br>27,865,860</td>
<td>685,135<br>656,294</td>
</tr>
<tr>
<td>EN↔RU</td>
<td>1,784,513</td>
<td>1,781,019<br>1,777,500</td>
<td>35,244,800<br>30,175,611</td>
<td>300,556<br>672,146</td>
</tr>
<tr>
<td>EN↔TR</td>
<td>1,788,564</td>
<td>1,781,019<br>1,782,257</td>
<td>35,344,188<br>27,806,708</td>
<td>300,556<br>656,294</td>
</tr>
<tr>
<td>RU↔TR</td>
<td>1,788,027</td>
<td>1,777,500<br>1,782,257</td>
<td>30,269,083<br>27,816,210</td>
<td>672,146<br>656,294</td>
</tr>
</tbody>
</table>
<h2 align = "center">Data Splits</h2>
<h3 align = "center">KazParC</h3>
<p align = "justify">We first created a test set by randomly selecting 250 unique and non-repeating rows from each of the sources outlined in <a href = "#sources_domains">Data Sources and Domains</a>.
The remaining data were divided into language pairs, following an 80/20 split, while ensuring that the distribution of domains was maintained within both the training and validation sets.</p>
<table align = "center">
<thead align = "center">
<tr>
<th rowspan="3">Pair</th>
<th colspan="4">Train</th>
<th colspan="4">Valid</th>
<th colspan="4">Test</th>
</tr>
<tr>
<th>#<br>lines</th>
<th>#<br>sents</th>
<th>#<br>tokens</th>
<th>#<br>types</th>
<th>#<br>lines</th>
<th>#<br>sents</th>
<th>#<br>tokens</th>
<th>#<br>types</th>
<th>#<br>lines</th>
<th>#<br>sents</th>
<th>#<br>tokens</th>
<th>#<br>types</th>
</tr>
</thead>
<tbody align = "center">
<tr>
<td>KK↔EN</td>
<td>290,877</td>
<td>286,958<br>286,197</td>
<td>3,693,263<br>5,057,687</td>
<td>164,766<br>54,311</td>
<td>72,719</td>
<td>72,426 <br>72,403</td>
<td>920,482<br>1,259,827</td>
<td>83,057<br>32,063</td>
<td>4,750</td>
<td>4,750 <br>4,750</td>
<td>57,044<br>75,867</td>
<td>17,475<br>9,729</td>
</tr>
<tr>
<td>KK↔RU</td>
<td>290,785</td>
<td>286,943 <br>287,215</td>
<td>3,689,799<br>3,945,741</td>
<td>164,995<br>165,882</td>
<td>72,697</td>
<td>72,413<br>72,439</td>
<td>923,750<br>988,374</td>
<td>82,958<br>87,519</td>
<td>4,750</td>
<td>4,750 <br>4,750</td>
<td>57,044<br>61,916</td>
<td>17,475<br>18,804</td>
</tr>
<tr>
<td>KK↔TR</td>
<td>289,720</td>
<td>286,694 <br>286,279</td>
<td>3,691,751<br>3,626,361</td>
<td>164,961<br>157,460</td>
<td>72,430</td>
<td>72,211 <br>72,190</td>
<td>920,057<br>904,199</td>
<td>82,698<br>80,885</td>
<td>4,750</td>
<td>4,750 <br>4,750</td>
<td>57,044<br>55,861</td>
<td>17,475<br>17,284</td>
</tr>
<tr>
<td>EN↔RU</td>
<td>290,764</td>
<td>286,185 <br>287,261</td>
<td>5,058,530<br>3,950,362</td>
<td>54,322<br>165,701</td>
<td>72,692</td>
<td>72,377 <br>72,427</td>
<td>1,257,904<br>982,032</td>
<td>32,208<br>87,541</td>
<td>4,750</td>
<td>4,750 <br>4,750</td>
<td>75,867<br>61,916</td>
<td>9,729<br>18,804</td>
</tr>
<tr>
<td>EN↔TR</td>
<td>289,913</td>
<td>285,967<br>286,288</td>
<td>5,048,274<br>3,621,531</td>
<td>54,224<br>157,369</td>
<td>72,479</td>
<td>72,220 <br>72,219</td>
<td>1,256,562<br>901,983</td>
<td>32,269<br>80,838</td>
<td>4,750</td>
<td>4,750 <br>4,750</td>
<td>75,867<br>55,861</td>
<td>9,729<br>17,284</td>
</tr>
<tr>
<td>RU↔TR</td>
<td>290,899</td>
<td>287,241 <br>286,475</td>
<td>3,947,809<br>3,626,436</td>
<td>165,482<br>157,470</td>
<td>72,725</td>
<td>72,455<br>72,362</td>
<td>990,125<br>909,550</td>
<td>87,831<br>80,962</td>
<td>4,750</td>
<td>4,750 <br>4,750</td>
<td>61,916<br>55,861</td>
<td>18,804<br>17,284</td>
</tr>
</tbody>
</table>
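The domain-preserving 80/20 split described above can be sketched as follows. This is an illustrative sketch with toy rows standing in for KazParC entries, not the authors' exact procedure:

```python
import random
from collections import defaultdict

def stratified_split(rows, key="domain", ratio=0.8, seed=42):
    """Split rows 80/20 while keeping the per-domain distribution."""
    by_domain = defaultdict(list)
    for row in rows:
        by_domain[row[key]].append(row)
    rng = random.Random(seed)
    train, valid = [], []
    for domain_rows in by_domain.values():
        rng.shuffle(domain_rows)
        cut = int(len(domain_rows) * ratio)
        train.extend(domain_rows[:cut])
        valid.extend(domain_rows[cut:])
    return train, valid

# toy rows standing in for KazParC entries
rows = [{"id": str(i), "domain": d}
        for d in ("mass_media", "legal", "fiction")
        for i in range(10)]
train, valid = stratified_split(rows)
print(len(train), len(valid))  # 24 6
```

Because the shuffle and cut happen within each domain, every domain keeps the same train/validation proportion.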
<h3 align = "center">SynC</h3>
<p align = "justify">We divided the synthetic corpus into training and validation sets with a 90/10 ratio.</p>
<table align = "center">
<thead align = "center">
<tr>
<th rowspan="2">Pair</th>
<th colspan="4">Train</th>
<th colspan="4">Valid</th>
</tr>
<tr>
<th># lines</th>
<th># sents</th>
<th># tokens</th>
<th># types</th>
<th># lines</th>
<th># sents</th>
<th># tokens</th>
<th># types</th>
</tr>
</thead>
<tbody align = "center">
<tr>
<td>KK↔EN</td>
<td>1,608,345</td>
<td>1,604,414<br>1,603,426</td>
<td>23,970,260<br>31,767,617</td>
<td>650,144<br>286,372</td>
<td>178,705</td>
<td>178,654<br>178,639</td>
<td>2,660,700<br>3,524,088</td>
<td>208,838<br>105,517</td>
</tr>
<tr>
<td>KK↔RU</td>
<td>1,608,703</td>
<td>1,604,468<br>1,600,643</td>
<td>23,992,148<br>27,221,583</td>
<td>650,170<br>642,604</td>
<td>178,745</td>
<td>178,691<br>178,642</td>
<td>2,662,047<br>3,020,312</td>
<td>209,188<br>235,642</td>
</tr>
<tr>
<td>KK↔TR</td>
<td>1,612,282</td>
<td>1,604,793<br>1,604,822</td>
<td>24,053,671<br>25,078,688</td>
<td>650,384<br>626,724</td>
<td>179,143</td>
<td>179,057<br>179,057</td>
<td>2,672,768<br>2,787,172</td>
<td>209,549<br>221,773</td>
</tr>
<tr>
<td>EN↔RU</td>
<td>1,606,061</td>
<td>1,603,199<br>1,600,372</td>
<td>31,719,781<br>27,158,101</td>
<td>286,645<br>642,686</td>
<td>178,452</td>
<td>178,419<br>178,379</td>
<td>3,525,019<br>3,017,510</td>
<td>104,834<br>235,069</td>
</tr>
<tr>
<td>EN↔TR</td>
<td>1,609,707</td>
<td>1,603,636<br>1,604,545</td>
<td>31,805,393<br>25,022,782</td>
<td>286,387<br>626,740</td>
<td>178,857</td>
<td>178,775<br>178,796</td>
<td>3,538,795<br>2,783,926</td>
<td>105,641<br>221,372</td>
</tr>
<tr>
<td>RU↔TR</td>
<td>1,609,224</td>
<td>1,600,605<br>1,604,521</td>
<td>27,243,278<br>25,035,274</td>
<td>642,797<br>626,587</td>
<td>178,803</td>
<td>178,695<br>178,750</td>
<td>3,025,805<br>2,780,936</td>
<td>235,970<br>221,792</td>
</tr>
</tbody>
</table>
<h2 align = "center">Corpus Structure</h2>
<p align = "justify">The entire corpus</a> is organised into two distinct groups based on their file prefixes. Files "01" through "19" have the "kazparc" prefix, while Files "20" to "32" have the "sync" prefix.</p>
```
├── kazparc
├── 01_kazparc_all_entries.csv
├── 02_kazparc_train_kk_en.csv
├── 03_kazparc_train_kk_ru.csv
├── 04_kazparc_train_kk_tr.csv
├── 05_kazparc_train_en_ru.csv
├── 06_kazparc_train_en_tr.csv
├── 07_kazparc_train_ru_tr.csv
├── 08_kazparc_valid_kk_en.csv
├── 09_kazparc_valid_kk_ru.csv
├── 10_kazparc_valid_kk_tr.csv
├── 11_kazparc_valid_en_ru.csv
├── 12_kazparc_valid_en_tr.csv
├── 13_kazparc_valid_ru_tr.csv
├── 14_kazparc_test_kk_en.csv
├── 15_kazparc_test_kk_ru.csv
├── 16_kazparc_test_kk_tr.csv
├── 17_kazparc_test_en_ru.csv
├── 18_kazparc_test_en_tr.csv
├── 19_kazparc_test_ru_tr.csv
├── sync
├── 20_sync_all_entries.csv
├── 21_sync_train_kk_en.csv
├── 22_sync_train_kk_ru.csv
├── 23_sync_train_kk_tr.csv
├── 24_sync_train_en_ru.csv
├── 25_sync_train_en_tr.csv
├── 26_sync_train_ru_tr.csv
├── 27_sync_valid_kk_en.csv
├── 28_sync_valid_kk_ru.csv
├── 29_sync_valid_kk_tr.csv
├── 30_sync_valid_en_ru.csv
├── 31_sync_valid_en_tr.csv
├── 32_sync_valid_ru_tr.csv
```
<h3 align = "center">KazParC files</h3>
<ul>
<li>File "01" contains the original, unprocessed text data for the four languages considered within KazParC.
<li>Files "02" through "19" represent pre-processed texts divided into language pairs for training (Files "02" to "07"), validation (Files "08" to "13"), and testing (Files "14" to "19"). Language pairs are indicated within the filenames using two-letter language codes (e.g., kk_en).
</ul>
<h3 align = "center">SynC files</h3>
<ul>
<li>File "20" contains raw, unprocessed text data for the four languages.</li>
<li>Files "21" to "32" contain pre-processed text divided into language pairs for training (Files "21" to "26") and validation (Files "27" to "32") purposes.</li>
</ul>
<h3 align = "center">Data Fields</h3>
<p align = "justify">In both "01" and "20", each line consists of specific components:</p>
- `id`: the unique line identifier
- `kk`: the sentence in Kazakh
- `en`: the sentence in English
- `ru`: the sentence in Russian
- `tr`: the sentence in Turkish
- `domain`: the domain of the sentence
<p align = "justify">For the other files, the fields are:</p>
- `id`: the unique line identifier
- `source_lang`: the source language code
- `target_lang`: the target language code
- `domain`: the domain of the sentence
- `pair`: the language pair
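As a minimal sketch of the pair-file layout, the columns listed above can be read with the standard `csv` module. The sample row below is made up for illustration; only the column names come from this card:

```python
import csv
import io

# hypothetical row mimicking the kazparc pair-file columns
sample = io.StringIO(
    "id,source_lang,target_lang,domain,pair\n"
    "1,en,kk,general,en_kk\n"
)
reader = csv.DictReader(sample)
for row in reader:
    print(row["source_lang"], row["pair"])  # en en_kk
```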
<h2 align = "center">How to Use</h2>
To load the subsets of KazParC separately:
```python
from datasets import load_dataset
kazparc_raw = load_dataset("issai/kazparc", "kazparc_raw")
kazparc = load_dataset("issai/kazparc", "kazparc")
sync_raw = load_dataset("issai/kazparc", "sync_raw")
sync = load_dataset("issai/kazparc", "sync")
``` |
allenai/common_gen | ---
annotations_creators:
- crowdsourced
language_creators:
- found
- crowdsourced
language:
- en
license:
- mit
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text2text-generation
task_ids: []
paperswithcode_id: commongen
pretty_name: CommonGen
tags:
- concepts-to-text
dataset_info:
features:
- name: concept_set_idx
dtype: int32
- name: concepts
sequence: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 6724166
num_examples: 67389
- name: validation
num_bytes: 408740
num_examples: 4018
- name: test
num_bytes: 77518
num_examples: 1497
download_size: 3434865
dataset_size: 7210424
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for "common_gen"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://inklab.usc.edu/CommonGen/index.html](https://inklab.usc.edu/CommonGen/index.html)
- **Repository:** https://github.com/INK-USC/CommonGen
- **Paper:** [CommonGen: A Constrained Text Generation Challenge for Generative Commonsense Reasoning](https://arxiv.org/abs/1911.03705)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 1.85 MB
- **Size of the generated dataset:** 7.21 MB
- **Total amount of disk used:** 9.06 MB
### Dataset Summary
CommonGen is a constrained text generation task, associated with a benchmark dataset,
to explicitly test machines for the ability of generative commonsense reasoning. Given
a set of common concepts, the task is to generate a coherent sentence describing an
everyday scenario using these concepts.
CommonGen is challenging because it inherently requires 1) relational reasoning using
background commonsense knowledge, and 2) compositional generalization ability to work
on unseen concept combinations. Our dataset, constructed through a combination of
crowd-sourcing from AMT and existing caption corpora, consists of 30k concept-sets and
50k sentences in total.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 1.85 MB
- **Size of the generated dataset:** 7.21 MB
- **Total amount of disk used:** 9.06 MB
An example of 'train' looks as follows.
```
{
"concept_set_idx": 0,
"concepts": ["ski", "mountain", "skier"],
"target": "Three skiers are skiing on a snowy mountain."
}
```
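The record above can be turned into a generation prompt with a small, self-contained sketch; the helper name and prompt template below are illustrative and not part of the dataset:

```python
def make_prompt(example):
    # Join the concept set into a comma-separated constraint list.
    concepts = ", ".join(example["concepts"])
    return f"Write a sentence about an everyday scene using: {concepts}."

# A record shaped like the 'train' example above.
sample = {
    "concept_set_idx": 0,
    "concepts": ["ski", "mountain", "skier"],
    "target": "Three skiers are skiing on a snowy mountain.",
}
prompt = make_prompt(sample)
print(prompt)  # Write a sentence about an everyday scene using: ski, mountain, skier.
```

The `target` field serves as the human-written reference sentence when scoring model generations.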
### Data Fields
The data fields are the same among all splits.
#### default
- `concept_set_idx`: a `int32` feature.
- `concepts`: a `list` of `string` features.
- `target`: a `string` feature.
### Data Splits
| name |train|validation|test|
|-------|----:|---------:|---:|
|default|67389| 4018|1497|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
The dataset is licensed under [MIT License](https://github.com/INK-USC/CommonGen/blob/master/LICENSE).
### Citation Information
```bibtex
@inproceedings{lin-etal-2020-commongen,
title = "{C}ommon{G}en: A Constrained Text Generation Challenge for Generative Commonsense Reasoning",
author = "Lin, Bill Yuchen and
Zhou, Wangchunshu and
Shen, Ming and
Zhou, Pei and
Bhagavatula, Chandra and
Choi, Yejin and
Ren, Xiang",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2020",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.findings-emnlp.165",
doi = "10.18653/v1/2020.findings-emnlp.165",
pages = "1823--1840"
}
```
### Contributions
Thanks to [@JetRunner](https://github.com/JetRunner), [@yuchenlin](https://github.com/yuchenlin), [@thomwolf](https://github.com/thomwolf), [@lhoestq](https://github.com/lhoestq) for adding this dataset. |
Kornberg/landsat_unfiltered | ---
dataset_info:
features:
- name: input
dtype: image
- name: target
dtype: image
- name: image_caption
dtype: string
splits:
- name: train
num_bytes: 1510993883.265
num_examples: 34085
download_size: 1485983470
dataset_size: 1510993883.265
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
conv_questions | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
language_bcp47:
- en-US
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- question-answering
- text-generation
- fill-mask
task_ids:
- open-domain-qa
- dialogue-modeling
pretty_name: ConvQuestions
dataset_info:
features:
- name: domain
dtype: string
- name: seed_entity
dtype: string
- name: seed_entity_text
dtype: string
- name: questions
sequence: string
- name: answers
sequence:
sequence: string
- name: answer_texts
sequence: string
splits:
- name: train
num_bytes: 3589880
num_examples: 6720
- name: validation
num_bytes: 1241778
num_examples: 2240
- name: test
num_bytes: 1175656
num_examples: 2240
download_size: 3276017
dataset_size: 6007314
---
# Dataset Card for ConvQuestions
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [ConvQuestions page](https://convex.mpi-inf.mpg.de)
- **Repository:** [GitHub](https://github.com/PhilippChr/CONVEX)
- **Paper:** [Look before you hop: Conversational question answering over knowledge graphs using judicious context expansion](https://arxiv.org/abs/1910.03262)
- **Leaderboard:** [ConvQuestions leaderboard](https://convex.mpi-inf.mpg.de)
- **Point of Contact:** [Philipp Christmann](mailto:pchristm@mpi-inf.mpg.de)
### Dataset Summary
ConvQuestions is the first realistic benchmark for conversational question answering over
knowledge graphs. It contains 11,200 conversations which can be evaluated over Wikidata.
They are compiled from the inputs of 70 Master crowdworkers on Amazon Mechanical Turk,
with conversations from five domains: Books, Movies, Soccer, Music, and TV Series.
The questions feature a variety of complex question phenomena like comparisons, aggregations,
compositionality, and temporal reasoning. Answers are grounded in Wikidata entities to enable
fair comparison across diverse methods. The data gathering setup was kept as natural as
possible, with the annotators selecting entities of their choice from each of the five domains,
and formulating the entire conversation in one session. All questions in a conversation are
from the same Turker, who also provided gold answers to the questions. For suitability to knowledge
graphs, questions were constrained to be objective or factoid in nature, but no other restrictive
guidelines were set. A notable property of ConvQuestions is that several questions are not
answerable by Wikidata alone (as of September 2019), but the required facts can, for example,
be found in the open Web or in Wikipedia. For details, please refer to the CIKM 2019 full paper
(https://dl.acm.org/citation.cfm?id=3358016).
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
The dataset is in English (en-US).
## Dataset Structure
### Data Instances
An example of 'train' looks as follows.
```
{
'domain': 'music',
'seed_entity': 'https://www.wikidata.org/wiki/Q223495',
'seed_entity_text': 'The Carpenters',
'questions': [
'When did The Carpenters sign with A&M Records?',
'What song was their first hit?',
'When did Karen die?',
'Karen had what eating problem?',
'and how did she die?'
],
'answers': [
[
'1969'
],
[
'https://www.wikidata.org/wiki/Q928282'
],
[
'1983'
],
[
'https://www.wikidata.org/wiki/Q131749'
],
[
'https://www.wikidata.org/wiki/Q181754'
]
],
'answer_texts': [
'1969',
'(They Long to Be) Close to You',
'1983',
'anorexia nervosa',
'heart failure'
]
}
```
### Data Fields
- `domain`: a `string` feature. Any of: ['books', 'movies', 'music', 'soccer', 'tv_series']
- `seed_entity`: a `string` feature. Wikidata ID of the topic entity.
- `seed_entity_text`: a `string` feature. Surface form of the topic entity.
- `questions`: a `list` of `string` features. List of questions (initial question and follow-up questions).
- `answers`: a `list` of `lists` of `string` features. List of answers, given as Wikidata IDs or literals (e.g. timestamps or names).
- `answer_texts`: a `list` of `string` features. List of surface forms of the answers.
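Because `answers` mixes Wikidata entity URLs with literals such as years, consumers typically need to tell the two apart. A minimal sketch (the helper name is ours, not part of the dataset):

```python
WIKIDATA_PREFIX = "https://www.wikidata.org/wiki/"

def is_wikidata_entity(answer: str) -> bool:
    # Entity answers are full Wikidata URLs; literals (years, names) are plain strings.
    return answer.startswith(WIKIDATA_PREFIX)

# Per-question answer lists, as in the 'train' example above.
answers = [["1969"], ["https://www.wikidata.org/wiki/Q928282"], ["1983"]]
flat = [a for per_question in answers for a in per_question]
entities = [a for a in flat if is_wikidata_entity(a)]
literals = [a for a in flat if not is_wikidata_entity(a)]
print(len(entities), len(literals))  # 1 2
```

Entity answers can then be resolved against Wikidata, while literals are compared directly.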
### Data Splits
|train|validation|test|
|----:|---------:|----:|
| 6720| 2240| 2240|
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
With insights from a meticulous in-house pilot study with ten students over two weeks, the authors posed the conversation generation task on Amazon Mechanical Turk (AMT) in the most natural setup: Each crowdworker was asked to build a conversation by asking five sequential questions starting from any seed entity of his/her choice, as this is an intuitive mental model that humans may have when satisfying their real information needs via their search assistants.
#### Who are the annotators?
Local students (Saarland Informatics Campus) and AMT Master Workers.
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
The ConvQuestions benchmark is licensed under a Creative Commons Attribution 4.0 International License.
### Citation Information
```
@InProceedings{christmann2019look,
title={Look before you hop: Conversational question answering over knowledge graphs using judicious context expansion},
author={Christmann, Philipp and Saha Roy, Rishiraj and Abujabal, Abdalghani and Singh, Jyotsna and Weikum, Gerhard},
booktitle={Proceedings of the 28th ACM International Conference on Information and Knowledge Management},
pages={729--738},
year={2019}
}
```
### Contributions
Thanks to [@PhilippChr](https://github.com/PhilippChr) for adding this dataset. |
Karzan/tts-dataset | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 129163929.0
num_examples: 53
download_size: 121573238
dataset_size: 129163929.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fdsghfdh/bjbhk | ---
license: openrail
---
|
InceptiveDev/Cover-Letters-Dataset | ---
license: mit
---
|
vwxyzjn/openhermes-dev-500-new-tokens__mistralai_Mixtral-8x7B-Instruct-v0.1__1707788532 | ---
dataset_info:
features:
- name: source
dtype: string
- name: category
dtype: string
- name: prompt
dtype: string
- name: candidate0_policy
dtype: string
- name: candidate0
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate1
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate1_policy
dtype: string
splits:
- name: train
num_bytes: 35821291.0
num_examples: 10000
download_size: 20292136
dataset_size: 35821291.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jensjorisdecorte/anonymous-working-histories | ---
license: cc-by-4.0
task_categories:
- text-classification
language:
- en
tags:
- Career Path Prediction
pretty_name: Synthetic ESCO skill sentences
size_categories:
- 1K<n<10K
---
# Structured Anonymous Career Paths extracted from Resumes
## Dataset Description
- **Homepage:** coming soon
- **Repository:** coming soon
- **Paper:** coming soon
- **Point of Contact:** jensjoris@techwolf.ai
### Dataset Summary
This dataset contains 2164 anonymous career paths across 24 different industries.
Each work experience is tagged with its corresponding ESCO occupation (ESCO v1.1.1).
### Languages
We use the English version of ESCO.
All resume data is in English as well.
## Dataset Structure
Each working history contains up to 17 experiences.
They appear in order, and each experience has a title, a description, a start, an ESCO URI, and an ESCO title field.
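As a rough sketch, a working history can be traversed like this; the record below and its field names are illustrative, since the card does not pin down exact keys:

```python
# Illustrative record; the dataset's actual field names may differ.
career = {
    "industry": "software",
    "experiences": [
        {"title": "Junior Developer", "description": "...", "start": "2015-01",
         "esco_uri": "http://data.europa.eu/esco/occupation/example-1",
         "esco_title": "software developer"},
        {"title": "Team Lead", "description": "...", "start": "2018-06",
         "esco_uri": "http://data.europa.eu/esco/occupation/example-2",
         "esco_title": "ICT project manager"},
    ],
}

# Experiences appear in order, so the ESCO titles trace the career path.
esco_path = [exp["esco_title"] for exp in career["experiences"]]
print(" -> ".join(esco_path))  # software developer -> ICT project manager
```

Sequences of ESCO titles like this are the usual input/target pairs for career path prediction.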
### Citation Information
[More Information Needed] |
open-llm-leaderboard/details_YeungNLP__LongQLoRA-Vicuna-13b-8k | ---
pretty_name: Evaluation run of YeungNLP/LongQLoRA-Vicuna-13b-8k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YeungNLP/LongQLoRA-Vicuna-13b-8k](https://huggingface.co/YeungNLP/LongQLoRA-Vicuna-13b-8k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YeungNLP__LongQLoRA-Vicuna-13b-8k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-18T20:09:15.984207](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__LongQLoRA-Vicuna-13b-8k/blob/main/results_2023-12-18T20-09-15.984207.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5363588804043325,\n\
\ \"acc_stderr\": 0.03398265746601784,\n \"acc_norm\": 0.5419352215266651,\n\
\ \"acc_norm_stderr\": 0.03471266124009366,\n \"mc1\": 0.31946144430844553,\n\
\ \"mc1_stderr\": 0.016322644182960498,\n \"mc2\": 0.4707041581162466,\n\
\ \"mc2_stderr\": 0.014774260072447868\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.53839590443686,\n \"acc_stderr\": 0.01456824555029636,\n\
\ \"acc_norm\": 0.5639931740614335,\n \"acc_norm_stderr\": 0.014491225699230916\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6042620991834295,\n\
\ \"acc_stderr\": 0.004880092083408043,\n \"acc_norm\": 0.8104959171479785,\n\
\ \"acc_norm_stderr\": 0.0039110756628832725\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.03999309712777474,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.03999309712777474\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835363,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835363\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.024180497164376907,\n \"\
acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376907\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6129032258064516,\n\
\ \"acc_stderr\": 0.027709359675032495,\n \"acc_norm\": 0.6129032258064516,\n\
\ \"acc_norm_stderr\": 0.027709359675032495\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.0338640574606209,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.0338640574606209\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.0331847733384533,\n \"acc_norm\"\
: 0.6818181818181818,\n \"acc_norm_stderr\": 0.0331847733384533\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807264,\n\
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807264\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.032437180551374116,\n\
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.032437180551374116\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6972477064220184,\n \"acc_stderr\": 0.01969871143475634,\n \"\
acc_norm\": 0.6972477064220184,\n \"acc_norm_stderr\": 0.01969871143475634\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.038968789850704164,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.038968789850704164\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.0471282125742677,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.0471282125742677\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.0498887651569859,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7266922094508301,\n\
\ \"acc_stderr\": 0.015936681062628556,\n \"acc_norm\": 0.7266922094508301,\n\
\ \"acc_norm_stderr\": 0.015936681062628556\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5635838150289018,\n \"acc_stderr\": 0.026700545424943684,\n\
\ \"acc_norm\": 0.5635838150289018,\n \"acc_norm_stderr\": 0.026700545424943684\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.02811092849280907,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.02811092849280907\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n\
\ \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.5819935691318328,\n\
\ \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5802469135802469,\n \"acc_stderr\": 0.027460099557005128,\n\
\ \"acc_norm\": 0.5802469135802469,\n \"acc_norm_stderr\": 0.027460099557005128\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.02914454478159615,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.02914454478159615\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41003911342894395,\n\
\ \"acc_stderr\": 0.01256183762196204,\n \"acc_norm\": 0.41003911342894395,\n\
\ \"acc_norm_stderr\": 0.01256183762196204\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5245098039215687,\n \"acc_stderr\": 0.020203517280261443,\n \
\ \"acc_norm\": 0.5245098039215687,\n \"acc_norm_stderr\": 0.020203517280261443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03136250240935893,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03136250240935893\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691584,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691584\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31946144430844553,\n\
\ \"mc1_stderr\": 0.016322644182960498,\n \"mc2\": 0.4707041581162466,\n\
\ \"mc2_stderr\": 0.014774260072447868\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.012248806969376422\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2304776345716452,\n \
\ \"acc_stderr\": 0.011600249020595834\n }\n}\n```"
repo_url: https://huggingface.co/YeungNLP/LongQLoRA-Vicuna-13b-8k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|arc:challenge|25_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|gsm8k|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hellaswag|10_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T20-09-15.984207.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T20-09-15.984207.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- '**/details_harness|winogrande|5_2023-12-18T20-09-15.984207.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-18T20-09-15.984207.parquet'
- config_name: results
data_files:
- split: 2023_12_18T20_09_15.984207
path:
- results_2023-12-18T20-09-15.984207.parquet
- split: latest
path:
- results_2023-12-18T20-09-15.984207.parquet
---
# Dataset Card for Evaluation run of YeungNLP/LongQLoRA-Vicuna-13b-8k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YeungNLP/LongQLoRA-Vicuna-13b-8k](https://huggingface.co/YeungNLP/LongQLoRA-Vicuna-13b-8k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YeungNLP__LongQLoRA-Vicuna-13b-8k",
	"harness_winogrande_5",
	split="latest")
```
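Each entry in the aggregated results is a plain nested dictionary keyed by task name, so per-task scores can be pulled out with ordinary Python once loaded. A minimal sketch, with the `metrics` values copied from the JSON excerpt below (only a few tasks shown for brevity):

```python
# Aggregated results have the same shape as the JSON excerpt in the
# "Latest results" section: a dict keyed by task name, with an "all"
# entry holding the overall averages.
metrics = {
    "all": {"acc": 0.5363588804043325, "acc_norm": 0.5419352215266651},
    "harness|arc:challenge|25": {"acc": 0.53839590443686, "acc_norm": 0.5639931740614335},
    "harness|hellaswag|10": {"acc": 0.6042620991834295, "acc_norm": 0.8104959171479785},
}

# Per-task accuracy, skipping the "all" aggregate entry.
per_task_acc = {
    task: scores["acc"] for task, scores in metrics.items() if task != "all"
}
```

This yields one accuracy per harness task, which is convenient for sorting or plotting the model's strengths and weaknesses across subjects.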
## Latest results
These are the [latest results from run 2023-12-18T20:09:15.984207](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__LongQLoRA-Vicuna-13b-8k/blob/main/results_2023-12-18T20-09-15.984207.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task's results are available in the "results" configuration and in the "latest" split of its own configuration):
```python
{
"all": {
"acc": 0.5363588804043325,
"acc_stderr": 0.03398265746601784,
"acc_norm": 0.5419352215266651,
"acc_norm_stderr": 0.03471266124009366,
"mc1": 0.31946144430844553,
"mc1_stderr": 0.016322644182960498,
"mc2": 0.4707041581162466,
"mc2_stderr": 0.014774260072447868
},
"harness|arc:challenge|25": {
"acc": 0.53839590443686,
"acc_stderr": 0.01456824555029636,
"acc_norm": 0.5639931740614335,
"acc_norm_stderr": 0.014491225699230916
},
"harness|hellaswag|10": {
"acc": 0.6042620991834295,
"acc_stderr": 0.004880092083408043,
"acc_norm": 0.8104959171479785,
"acc_norm_stderr": 0.0039110756628832725
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.03999309712777474,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.03999309712777474
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.041614023984032786,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.041614023984032786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835363,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835363
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376907,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6129032258064516,
"acc_stderr": 0.027709359675032495,
"acc_norm": 0.6129032258064516,
"acc_norm_stderr": 0.027709359675032495
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.0331847733384533,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.0331847733384533
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.025342671293807264,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.025342671293807264
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.032437180551374116,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.032437180551374116
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6972477064220184,
"acc_stderr": 0.01969871143475634,
"acc_norm": 0.6972477064220184,
"acc_norm_stderr": 0.01969871143475634
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.043285772152629715,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.043285772152629715
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0471282125742677,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0471282125742677
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.56,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7266922094508301,
"acc_stderr": 0.015936681062628556,
"acc_norm": 0.7266922094508301,
"acc_norm_stderr": 0.015936681062628556
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5635838150289018,
"acc_stderr": 0.026700545424943684,
"acc_norm": 0.5635838150289018,
"acc_norm_stderr": 0.026700545424943684
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.02811092849280907,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.02811092849280907
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5819935691318328,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.5819935691318328,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5802469135802469,
"acc_stderr": 0.027460099557005128,
"acc_norm": 0.5802469135802469,
"acc_norm_stderr": 0.027460099557005128
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.02914454478159615,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.02914454478159615
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41003911342894395,
"acc_stderr": 0.01256183762196204,
"acc_norm": 0.41003911342894395,
"acc_norm_stderr": 0.01256183762196204
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5245098039215687,
"acc_stderr": 0.020203517280261443,
"acc_norm": 0.5245098039215687,
"acc_norm_stderr": 0.020203517280261443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6,
"acc_stderr": 0.03136250240935893,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03136250240935893
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691584,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691584
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31946144430844553,
"mc1_stderr": 0.016322644182960498,
"mc2": 0.4707041581162466,
"mc2_stderr": 0.014774260072447868
},
"harness|winogrande|5": {
"acc": 0.745067087608524,
"acc_stderr": 0.012248806969376422
},
"harness|gsm8k|5": {
"acc": 0.2304776345716452,
"acc_stderr": 0.011600249020595834
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
TIGER-Lab/MathInstruct | ---
license: mit
task_categories:
- text-generation
language:
- en
pretty_name: MathInstruct
size_categories:
- 100K<n<1M
tags:
- math
---
# 🦣 MAmmoTH: Building Math Generalist Models through Hybrid Instruction Tuning
MathInstruct is a meticulously curated instruction tuning dataset that is lightweight yet generalizable. MathInstruct is compiled from 13 math rationale datasets, six of which are newly curated by this work. It uniquely focuses on the hybrid use of chain-of-thought (CoT) and program-of-thought (PoT) rationales, and ensures extensive coverage of diverse mathematical fields.
Project Page: [https://tiger-ai-lab.github.io/MAmmoTH/](https://tiger-ai-lab.github.io/MAmmoTH/)
Paper: [https://arxiv.org/pdf/2309.05653.pdf](https://arxiv.org/pdf/2309.05653.pdf)
Code: [https://github.com/TIGER-AI-Lab/MAmmoTH](https://github.com/TIGER-AI-Lab/MAmmoTH)
Models:
| | **Base Model: Llama-2** | **Base Model: Code Llama** |
|-----|---------------------------------------------------------------|--------------------------------------------------------------------------|
| 7B | 🦣 [MAmmoTH-7B](https://huggingface.co/TIGER-Lab/MAmmoTH-7B) | 🦣 [MAmmoTH-Coder-7B](https://huggingface.co/TIGER-Lab/MAmmoTH-Coder-7B) |
| 13B | 🦣 [MAmmoTH-13B](https://huggingface.co/TIGER-Lab/MAmmoTH-13B) | 🦣 [MAmmoTH-Coder-13B](https://huggingface.co/TIGER-Lab/MAmmoTH-Coder-13B)|
| 34B | - | 🦣 [MAmmoTH-Coder-34B](https://huggingface.co/TIGER-Lab/MAmmoTH-Coder-34B)|
| 70B | 🦣 [MAmmoTH-70B](https://huggingface.co/TIGER-Lab/MAmmoTH-70B) | - |
## **License**
Please check out the license of each subset in our curated dataset MathInstruct.
| Dataset Name | License Type |
|--------------|----------------|
| GSM8K | MIT |
| GSM8K-RFT | Not listed |
| AQuA-RAT | Apache 2.0 |
| MATH | MIT |
| TheoremQA | MIT |
| Camel-Math | Attribution-NonCommercial 4.0 International |
| NumGLUE | Apache-2.0 |
| MathQA | Apache-2.0 |
| Our Curated | MIT |
## **Citation**
Please cite our paper if you use our data, model or code. Please also kindly cite the original dataset papers.
```
@article{yue2023mammoth,
title={MAmmoTH: Building Math Generalist Models through Hybrid Instruction Tuning},
author={Xiang Yue, Xingwei Qu, Ge Zhang, Yao Fu, Wenhao Huang, Huan Sun, Yu Su, Wenhu Chen},
journal={arXiv preprint arXiv:2309.05653},
year={2023}
}
``` |
adrionthiago/teste | ---
license: openrail
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-80000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 640997
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lint/danbooru_tags | ---
license: mit
---
The MIT license applies to the code inside this repository.
This dataset consists of tag strings for Danbooru image posts. I originally extracted the tag strings to generate a synthetic dataset of anime-styled images at https://github.com/1lint/anybooru for tuning Stable Diffusion. The tag strings could also be used to train a language model that generates prompts for anime-styled Stable Diffusion checkpoints. I hope this dataset can be of use to you!
## Quick start
```python
from datasets import load_dataset
data_files = {"train": "2021_0_pruned.parquet"}
dataset = load_dataset("lint/danbooru_tags", data_files=data_files)
print(dataset['train'][0])
```
The data is stored as pandas dataframes in parquet format, with the filename specifying the year, section number and whether the data was pruned.
The full dataframe contains all metadata fields collected by Gwern, while the pruned data keeps only the tag string and post ID fields (`tags`, `id`), filtered to submissions with a score > 2 that are rated as (relatively) SFW.
If it's helpful, I can also upload similarly extracted tags for other years of metadata; just leave a request in the Community tab. You can also extract the tags and/or filter the dataset yourself using the included notebook.
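The pruning filter described above (score > 2, relatively SFW) can be sketched with plain pandas. A minimal illustration on synthetic rows — the column names `score` and `rating` and the rating code `s` are assumptions about Gwern's metadata schema, not verified field names:

```python
import pandas as pd

# Illustrative metadata rows; the real files would be read with pd.read_parquet(...).
df = pd.DataFrame({
    "id": [1, 2, 3, 4],
    "tags": ["1girl solo", "landscape sky", "1boy", "2girls smile"],
    "score": [5, 1, 3, 10],
    "rating": ["s", "s", "q", "s"],  # assumed codes: s = safe, q = questionable, e = explicit
})

# Keep only (relatively) SFW posts with a submission score above 2,
# then drop everything except the tag string and post ID.
pruned = df[(df["score"] > 2) & (df["rating"] == "s")][["tags", "id"]]
print(pruned)
```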
See `generate_tags_dataset.ipynb` for how the tags dataset was extracted from the Danbooru metadata collected by Gwern at https://gwern.net/danbooru2021.
```
@misc{danbooru2021,
  author = {Anonymous and Danbooru community and Gwern Branwen},
  title = {Danbooru2021: A Large-Scale Crowdsourced and Tagged Anime Illustration Dataset},
  howpublished = {\url{https://gwern.net/danbooru2021}},
  url = {https://gwern.net/danbooru2021},
  type = {dataset},
  year = {2022},
  month = {January},
  timestamp = {2022-01-21},
  note = {Accessed: 03/01/2023}
}
``` |
open-llm-leaderboard/details_ChaoticNeutrals__Cookie_7B | ---
pretty_name: Evaluation run of ChaoticNeutrals/Cookie_7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ChaoticNeutrals/Cookie_7B](https://huggingface.co/ChaoticNeutrals/Cookie_7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChaoticNeutrals__Cookie_7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-21T07:19:11.368453](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__Cookie_7B/blob/main/results_2024-02-21T07-19-11.368453.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6489384774347392,\n\
\ \"acc_stderr\": 0.032131421920616465,\n \"acc_norm\": 0.6498781521461111,\n\
\ \"acc_norm_stderr\": 0.032783132740631354,\n \"mc1\": 0.5128518971848225,\n\
\ \"mc1_stderr\": 0.01749771794429982,\n \"mc2\": 0.6687534212220169,\n\
\ \"mc2_stderr\": 0.015263939252034519\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6791808873720137,\n \"acc_stderr\": 0.01364094309194653,\n\
\ \"acc_norm\": 0.697098976109215,\n \"acc_norm_stderr\": 0.013428241573185349\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7105158334993029,\n\
\ \"acc_stderr\": 0.004525960965551707,\n \"acc_norm\": 0.8757219677355108,\n\
\ \"acc_norm_stderr\": 0.0032922425436373417\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224469,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224469\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593552,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593552\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887027,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887027\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468358,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468358\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46368715083798884,\n\
\ \"acc_stderr\": 0.016678341894533166,\n \"acc_norm\": 0.46368715083798884,\n\
\ \"acc_norm_stderr\": 0.016678341894533166\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45697522816166886,\n\
\ \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.45697522816166886,\n\
\ \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6470588235294118,\n \"acc_stderr\": 0.01933314202079716,\n \
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.01933314202079716\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274645,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274645\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5128518971848225,\n\
\ \"mc1_stderr\": 0.01749771794429982,\n \"mc2\": 0.6687534212220169,\n\
\ \"mc2_stderr\": 0.015263939252034519\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.010941877955676207\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6118271417740713,\n \
\ \"acc_stderr\": 0.013423607564002757\n }\n}\n```"
repo_url: https://huggingface.co/ChaoticNeutrals/Cookie_7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|arc:challenge|25_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|gsm8k|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hellaswag|10_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T07-19-11.368453.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T07-19-11.368453.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- '**/details_harness|winogrande|5_2024-02-21T07-19-11.368453.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-21T07-19-11.368453.parquet'
- config_name: results
data_files:
- split: 2024_02_21T07_19_11.368453
path:
- results_2024-02-21T07-19-11.368453.parquet
- split: latest
path:
- results_2024-02-21T07-19-11.368453.parquet
---
# Dataset Card for Evaluation run of ChaoticNeutrals/Cookie_7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ChaoticNeutrals/Cookie_7B](https://huggingface.co/ChaoticNeutrals/Cookie_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run appears as a specific split in each configuration, named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
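Since each split name encodes the run timestamp, it can be converted back to a `datetime` for sorting or filtering runs. A minimal sketch, assuming the `%Y_%m_%dT%H_%M_%S.%f` naming pattern visible in the configs above:

```python
from datetime import datetime

# Split names such as "2024_02_21T07_19_11.368453" encode the run timestamp.
# The format string is an assumption inferred from the split names in the configs above.
def split_to_datetime(split_name: str) -> datetime:
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

print(split_to_datetime("2024_02_21T07_19_11.368453"))
# → 2024-02-21 07:19:11.368453
```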
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ChaoticNeutrals__Cookie_7B",
"harness_winogrande_5",
split="train")
```
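The aggregated results themselves are plain nested dictionaries keyed by task name. A minimal sketch pulling one score out of a payload shaped like the JSON under "Latest results" (the fragment here is copied from the scores in this card):

```python
# A fragment of the aggregated results JSON shown under "Latest results".
results = {
    "all": {"acc": 0.6489384774347392, "acc_stderr": 0.032131421920616465},
    "harness|winogrande|5": {"acc": 0.813733228097869, "acc_stderr": 0.010941877955676207},
}

# Pull out a single task's accuracy along with its standard error.
task = "harness|winogrande|5"
acc = results[task]["acc"]
err = results[task]["acc_stderr"]
print(f"{task}: {acc:.3f} +/- {err:.3f}")
# → harness|winogrande|5: 0.814 +/- 0.011
```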
## Latest results
These are the [latest results from run 2024-02-21T07:19:11.368453](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__Cookie_7B/blob/main/results_2024-02-21T07-19-11.368453.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in the "latest" split of its configuration):
```python
{
"all": {
"acc": 0.6489384774347392,
"acc_stderr": 0.032131421920616465,
"acc_norm": 0.6498781521461111,
"acc_norm_stderr": 0.032783132740631354,
"mc1": 0.5128518971848225,
"mc1_stderr": 0.01749771794429982,
"mc2": 0.6687534212220169,
"mc2_stderr": 0.015263939252034519
},
"harness|arc:challenge|25": {
"acc": 0.6791808873720137,
"acc_stderr": 0.01364094309194653,
"acc_norm": 0.697098976109215,
"acc_norm_stderr": 0.013428241573185349
},
"harness|hellaswag|10": {
"acc": 0.7105158334993029,
"acc_stderr": 0.004525960965551707,
"acc_norm": 0.8757219677355108,
"acc_norm_stderr": 0.0032922425436373417
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593552,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887027,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887027
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468358,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468358
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46368715083798884,
"acc_stderr": 0.016678341894533166,
"acc_norm": 0.46368715083798884,
"acc_norm_stderr": 0.016678341894533166
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292452,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45697522816166886,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.45697522816166886,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.01933314202079716,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.01933314202079716
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274645,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5128518971848225,
"mc1_stderr": 0.01749771794429982,
"mc2": 0.6687534212220169,
"mc2_stderr": 0.015263939252034519
},
"harness|winogrande|5": {
"acc": 0.813733228097869,
"acc_stderr": 0.010941877955676207
},
"harness|gsm8k|5": {
"acc": 0.6118271417740713,
"acc_stderr": 0.013423607564002757
}
}
```
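As a rough sketch of how these per-task scores can be post-processed, the snippet below ranks tasks by their normalized accuracy when it is reported, falling back to plain accuracy otherwise. The handful of entries are copied from the JSON above; the full file follows the same `{"task": {"metric": value, ...}}` layout.

```python
# A few entries copied from the results above (assumption: the full
# results JSON follows this same nested-dict layout).
results = {
    "harness|arc:challenge|25": {"acc": 0.6791808873720137, "acc_norm": 0.697098976109215},
    "harness|hellaswag|10": {"acc": 0.7105158334993029, "acc_norm": 0.8757219677355108},
    "harness|gsm8k|5": {"acc": 0.6118271417740713},
}

def score(metrics):
    """Prefer normalized accuracy; fall back to plain accuracy."""
    return metrics.get("acc_norm", metrics.get("acc"))

# Rank tasks from best to worst.
for task, metrics in sorted(results.items(), key=lambda kv: -score(kv[1])):
    print(f"{task}: {score(metrics):.3f}")
```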
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
davidpistori/myprovence | ---
license: apache-2.0
---
|
Keviv123/resume_dataset | ---
license: cc0-1.0
---
|
ravithejads/teknium_GPTeacher_general_instruct_telugu_filtered_and_romanized | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 132653766
num_examples: 87624
download_size: 57183930
dataset_size: 132653766
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ruffusplay/ajolote | ---
license: openrail
---
|
FabioInspires/hoodies | ---
license: openrail
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 82775.0
num_examples: 10
download_size: 82557
dataset_size: 82775.0
---
|
softcatala/Tilde-MODEL-Catalan | ---
annotations_creators: []
language_creators:
- machine-generated
language:
- ca
- de
license:
- cc-by-4.0
multilinguality:
- translation
size_categories:
- 1M<n<10M
source_datasets:
- extended|tilde_model
task_categories:
- text2text-generation
- translation
task_ids: []
pretty_name: Catalan-German aligned corpora to train NMT systems.
tags:
- conditional-text-generation
---
# Dataset Card for Tilde-MODEL-Catalan
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.softcatala.org/
- **Repository:** https://github.com/Softcatala/Tilde-MODEL-catalan
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains the German version of the Tilde-MODEL corpus aligned with a Catalan translation.
The Catalan text has been obtained using Apertium's RBMT system from the Spanish version. It contains 3.4M segments.
### Supported Tasks and Leaderboards
This dataset can be used to train NMT and SMT systems.
It has been used as a training corpus for the [Softcatalà machine translation engine](https://www.softcatala.org/traductor/).
### Languages
Catalan (`ca`).
German (`de`).
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
Raw text.
### Data Splits
One file per language.
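Since each language comes as a plain-text file with one segment per line, the two files can be paired line by line. The sketch below is illustrative only: the filenames shown in the comment are assumptions, not taken from the repository.

```python
def align_segments(ca_lines, de_lines):
    """Pair Catalan and German segments line by line.

    The inputs are assumed to be parallel: line i of one file is the
    translation of line i of the other. Pairs where either side is
    empty after stripping are dropped.
    """
    pairs = []
    for ca, de in zip(ca_lines, de_lines):
        ca, de = ca.strip(), de.strip()
        if ca and de:
            pairs.append((ca, de))
    return pairs

# Illustrative usage with in-memory lines; in practice you would open
# the two downloaded corpus files (hypothetical names), e.g.:
#   with open("tilde.ca-de.ca", encoding="utf-8") as f_ca, \
#        open("tilde.ca-de.de", encoding="utf-8") as f_de:
#       pairs = align_segments(f_ca, f_de)
sample = align_segments(["Bon dia", ""], ["Guten Tag", ""])
print(sample)
```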
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[@softcatala](https://github.com/Softcatala)
[@jordimas](https://github.com/jordimas)
[@davidcanovas](https://github.com/davidcanovas)
### Licensing Information
[CC BY 4.0](https://creativecommons.org/licenses/by/4.0/).
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
CyberHarem/ch_en_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ch'en/チェン/陈 (Arknights)
This is the dataset of ch'en/チェン/陈 (Arknights), containing 500 images and their tags.
The core tags of this character are `horns, blue_hair, dragon_horns, long_hair, red_eyes, hair_between_eyes, breasts, dragon_tail, tail, medium_breasts, twintails, sidelocks`, which are pruned in this dataset.
Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1008.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ch_en_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 500      | 449.18 MiB  | [Download](https://huggingface.co/datasets/CyberHarem/ch_en_arknights/resolve/main/dataset-800.zip)                | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800 | 1337 | 1.01 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ch_en_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 500      | 824.33 MiB  | [Download](https://huggingface.co/datasets/CyberHarem/ch_en_arknights/resolve/main/dataset-1200.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 1337 | 1.58 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ch_en_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ch_en_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 26 |  |  |  |  |  | 1girl, holding_sword, short_shorts, solo, white_shirt, black_gloves, black_jacket, black_shorts, looking_at_viewer, open_jacket, yellow_necktie, fingerless_gloves, sleeveless_shirt, bare_shoulders, off_shoulder, cowboy_shot, belt, thighs, navel, collared_shirt, standing, low_twintails |
| 1 | 5 |  |  |  |  |  | 1girl, black_footwear, black_gloves, black_jacket, black_shorts, fingerless_gloves, looking_at_viewer, open_jacket, shin_guards, short_shorts, solo, white_shirt, yellow_necktie, boots, closed_mouth, full_body, holding_sword, low_twintails, collared_shirt, thighs, sitting, standing |
| 2 | 22 |  |  |  |  |  | 1girl, solo, white_shirt, yellow_necktie, upper_body, black_jacket, looking_at_viewer, open_jacket, off_shoulder, sleeveless_shirt, bare_shoulders, collared_shirt, low_twintails, closed_mouth, simple_background, white_background |
| 3 | 12 |  |  |  |  |  | 1girl, bare_shoulders, black_bikini, blue_sky, cloud, day, navel, official_alternate_costume, outdoors, solo, standing, stomach, bare_arms, baseball_cap, cowboy_shot, horns_through_headwear, looking_at_viewer, open_fly, grey_shorts, short_shorts, white_headwear, collarbone, pouch, highleg_bikini, thighs, water, halterneck, beach, belt, ocean, wristwatch |
| 4 | 8 |  |  |  |  |  | 1girl, bare_arms, bare_shoulders, black_bikini, highleg_bikini, looking_at_viewer, navel, official_alternate_costume, short_shorts, simple_background, solo, stomach, thighs, white_background, baseball_cap, horns_through_headwear, open_fly, sunglasses, cowboy_shot, grey_shorts, halterneck, holding_removed_eyewear, pouch, standing, unworn_eyewear, white_headwear, belt, folded_ponytail, heart, large_breasts, collarbone, white_shorts, wristwatch |
| 5 | 5 |  |  |  |  |  | 1girl, bare_shoulders, china_dress, cleavage_cutout, double_bun, looking_at_viewer, official_alternate_costume, sleeveless_dress, solo, red_dress, smile, upper_body, bare_arms, large_breasts, hand_up, red_background |
| 6 | 23 |  |  |  |  |  | 1girl, bare_arms, bare_shoulders, china_dress, double_bun, looking_at_viewer, official_alternate_costume, red_dress, sleeveless_dress, solo, black_shorts, thigh_strap, short_shorts, thighs, white_background, cleavage_cutout, bead_bracelet, simple_background, bare_legs, red_footwear, sitting, high_heels, hand_up, large_breasts |
| 7 | 5 |  |  |  |  |  | 1girl, black_dress, closed_mouth, enmaided, frills, large_breasts, maid_apron, simple_background, solo, white_apron, white_background, blush, dragon_girl, juliet_sleeves, looking_at_viewer, maid_headdress, cowboy_shot, from_side, holding, standing, upper_body |
| 8 | 5 |  |  |  |  |  | 1girl, closed_mouth, long_sleeves, solo, alternate_costume, looking_at_viewer, black_dress, cleavage, low_twintails, sitting, dragon_girl, playing_instrument, purple_eyes, smile, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | holding_sword | short_shorts | solo | white_shirt | black_gloves | black_jacket | black_shorts | looking_at_viewer | open_jacket | yellow_necktie | fingerless_gloves | sleeveless_shirt | bare_shoulders | off_shoulder | cowboy_shot | belt | thighs | navel | collared_shirt | standing | low_twintails | black_footwear | shin_guards | boots | closed_mouth | full_body | sitting | upper_body | simple_background | white_background | black_bikini | blue_sky | cloud | day | official_alternate_costume | outdoors | stomach | bare_arms | baseball_cap | horns_through_headwear | open_fly | grey_shorts | white_headwear | collarbone | pouch | highleg_bikini | water | halterneck | beach | ocean | wristwatch | sunglasses | holding_removed_eyewear | unworn_eyewear | folded_ponytail | heart | large_breasts | white_shorts | china_dress | cleavage_cutout | double_bun | sleeveless_dress | red_dress | smile | hand_up | red_background | thigh_strap | bead_bracelet | bare_legs | red_footwear | high_heels | black_dress | enmaided | frills | maid_apron | white_apron | blush | dragon_girl | juliet_sleeves | maid_headdress | from_side | holding | long_sleeves | alternate_costume | cleavage | playing_instrument | purple_eyes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:---------------|:-------|:--------------|:---------------|:---------------|:---------------|:--------------------|:--------------|:-----------------|:--------------------|:-------------------|:-----------------|:---------------|:--------------|:-------|:---------|:--------|:-----------------|:-----------|:----------------|:-----------------|:--------------|:--------|:---------------|:------------|:----------|:-------------|:--------------------|:-------------------|:---------------|:-----------|:--------|:------|:-----------------------------|:-----------|:----------|:------------|:---------------|:-------------------------|:-----------|:--------------|:-----------------|:-------------|:--------|:-----------------|:--------|:-------------|:--------|:--------|:-------------|:-------------|:--------------------------|:-----------------|:------------------|:--------|:----------------|:---------------|:--------------|:------------------|:-------------|:-------------------|:------------|:--------|:----------|:-----------------|:--------------|:----------------|:------------|:---------------|:-------------|:--------------|:-----------|:---------|:-------------|:--------------|:--------|:--------------|:-----------------|:-----------------|:------------|:----------|:---------------|:--------------------|:-----------|:---------------------|:--------------|
| 0 | 26 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 22 |  |  |  |  |  | X | | | X | X | | X | | X | X | X | | X | X | X | | | | | X | | X | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | X | | X | X | | | | | X | | | | | X | | X | X | X | X | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | | X | X | | | | | X | | | | | X | | X | X | X | X | | X | | | | | | | | | X | X | X | | | | X | | X | X | X | X | X | X | X | X | X | X | | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | X | | | | | X | | | | | X | | | | | | | | | | | | | | | X | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 6 | 23 |  |  |  |  |  | X | | X | X | | | | X | X | | | | | X | | | | X | | | | | | | | | | X | | X | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | X | | | | | X | | | | | | | X | | | | | X | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 8 | 5 |  |  |  |  |  | X | | | X | | | | | X | | | | | | | | | | | | | X | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | X | | | | | X | X | X | X | X |
|
open-llm-leaderboard/details_DreadPoor__Westuccine-7B-slerp | ---
pretty_name: Evaluation run of DreadPoor/Westuccine-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DreadPoor/Westuccine-7B-slerp](https://huggingface.co/DreadPoor/Westuccine-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__Westuccine-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-27T03:50:45.691578](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__Westuccine-7B-slerp/blob/main/results_2024-01-27T03-50-45.691578.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6400614569373341,\n\
\ \"acc_stderr\": 0.03244478621301286,\n \"acc_norm\": 0.6429052954858089,\n\
\ \"acc_norm_stderr\": 0.03310193291140016,\n \"mc1\": 0.5324357405140759,\n\
\ \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6934215572473478,\n\
\ \"mc2_stderr\": 0.015166987873604024\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.658703071672355,\n \"acc_stderr\": 0.013855831287497723,\n\
\ \"acc_norm\": 0.6936860068259386,\n \"acc_norm_stderr\": 0.013470584417276516\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7146982672774348,\n\
\ \"acc_stderr\": 0.004506351723820961,\n \"acc_norm\": 0.8734315873332006,\n\
\ \"acc_norm_stderr\": 0.0033180935797029205\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101737,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101737\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067884,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067884\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"\
acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6435897435897436,\n \"acc_stderr\": 0.024283140529467305,\n\
\ \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.024283140529467305\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857416,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857416\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530302,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530302\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794087,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794087\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.014143970276657569,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.014143970276657569\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4558659217877095,\n\
\ \"acc_stderr\": 0.01665722942458631,\n \"acc_norm\": 0.4558659217877095,\n\
\ \"acc_norm_stderr\": 0.01665722942458631\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388995,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388995\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621355,\n\
\ \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621355\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n\
\ \"acc_stderr\": 0.01273239828619044,\n \"acc_norm\": 0.46153846153846156,\n\
\ \"acc_norm_stderr\": 0.01273239828619044\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854125,\n \
\ \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854125\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.043502714429232425,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.043502714429232425\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675592,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675592\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5324357405140759,\n\
\ \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6934215572473478,\n\
\ \"mc2_stderr\": 0.015166987873604024\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047987\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4852160727824109,\n \
\ \"acc_stderr\": 0.013766463050787596\n }\n}\n```"
repo_url: https://huggingface.co/DreadPoor/Westuccine-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|arc:challenge|25_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|arc:challenge|25_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|gsm8k|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|gsm8k|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hellaswag|10_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hellaswag|10_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T03-45-30.618423.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T03-50-45.691578.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T03-50-45.691578.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- '**/details_harness|winogrande|5_2024-01-27T03-45-30.618423.parquet'
- split: 2024_01_27T03_50_45.691578
path:
- '**/details_harness|winogrande|5_2024-01-27T03-50-45.691578.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-27T03-50-45.691578.parquet'
- config_name: results
data_files:
- split: 2024_01_27T03_45_30.618423
path:
- results_2024-01-27T03-45-30.618423.parquet
- split: 2024_01_27T03_50_45.691578
path:
- results_2024-01-27T03-50-45.691578.parquet
- split: latest
path:
- results_2024-01-27T03-50-45.691578.parquet
---
# Dataset Card for Evaluation run of DreadPoor/Westuccine-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/Westuccine-7B-slerp](https://huggingface.co/DreadPoor/Westuccine-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
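As the configuration list above shows, each run's split name is simply its timestamp with dashes and colons replaced by underscores. A small helper sketching that convention (`timestamp_to_split` is illustrative only, not part of any library):

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp (e.g. from a results filename) to the split
    name used in this repo: dashes and colons become underscores, while the
    fractional seconds are kept as-is."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-01-27T03:50:45.691578"))
# → 2024_01_27T03_50_45.691578
```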
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__Westuccine-7B-slerp",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-27T03:50:45.691578](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__Westuccine-7B-slerp/blob/main/results_2024-01-27T03-50-45.691578.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6400614569373341,
"acc_stderr": 0.03244478621301286,
"acc_norm": 0.6429052954858089,
"acc_norm_stderr": 0.03310193291140016,
"mc1": 0.5324357405140759,
"mc1_stderr": 0.017466632149577613,
"mc2": 0.6934215572473478,
"mc2_stderr": 0.015166987873604024
},
"harness|arc:challenge|25": {
"acc": 0.658703071672355,
"acc_stderr": 0.013855831287497723,
"acc_norm": 0.6936860068259386,
"acc_norm_stderr": 0.013470584417276516
},
"harness|hellaswag|10": {
"acc": 0.7146982672774348,
"acc_stderr": 0.004506351723820961,
"acc_norm": 0.8734315873332006,
"acc_norm_stderr": 0.0033180935797029205
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7320754716981132,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.7320754716981132,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101737,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101737
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067884,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067884
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.024283140529467305,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.024283140529467305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857416,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530302,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530302
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794087,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657569,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657569
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4558659217877095,
"acc_stderr": 0.01665722942458631,
"acc_norm": 0.4558659217877095,
"acc_norm_stderr": 0.01665722942458631
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388995,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388995
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621355,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621355
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.01273239828619044,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.01273239828619044
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.029349803139765873,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.029349803139765873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854125,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854125
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.043502714429232425,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.043502714429232425
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675592,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675592
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5324357405140759,
"mc1_stderr": 0.017466632149577613,
"mc2": 0.6934215572473478,
"mc2_stderr": 0.015166987873604024
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047987
},
"harness|gsm8k|5": {
"acc": 0.4852160727824109,
"acc_stderr": 0.013766463050787596
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
eswardivi/Tamil_MSA_Audio_Text | ---
dataset_info:
features:
- name: Audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
- name: FilePath
dtype: string
- name: Text
dtype: string
splits:
- name: train
num_bytes: 436903500
num_examples: 64
download_size: 435262950
dataset_size: 436903500
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- text-classification
- audio-classification
language:
- ta
size_categories:
- n<1K
---
# Dataset Details
- **Title:** Dravidianmultimodality: A dataset for multi-modal sentiment analysis in Tamil and Malayalam
- **Authors:** Bharathi Raja Chakravarthi et al.
- **Link to Paper:** [arXiv:2106.04853](https://arxiv.org/abs/2106.04853)
- **Published:** 2021
- **Source:** arXiv preprint
|
davidadamczyk/weather_experiment | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: label_text
dtype: string
splits:
- name: train
num_bytes: 25953.6
num_examples: 150
- name: test
num_bytes: 17302.4
num_examples: 100
download_size: 31007
dataset_size: 43256.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Sk4467/Intrux_dataset_2 | ---
license: apache-2.0
---
|
darrow-ai/LegalLensNLI | ---
license: apache-2.0
task_categories:
- zero-shot-classification
- text-classification
language:
- en
tags:
- legal
- legalnlp
- class action
pretty_name: JusticeLens
size_categories:
- n<1K
---
- **Homepage:** https://www.darrow.ai/
- **Repository:** https://github.com/darrow-labs/LegalLens
- **Paper:** https://arxiv.org/pdf/2402.04335.pdf
- **Point of Contact:** [Dor Bernsohn](mailto:dor.bernsohn@darrow.ai),[Gil Semo](mailto:gil.semo@darrow.ai)
## Overview
The LegalLensNLI dataset is a collection of entries designed to capture the connection between legal cases and the people affected by them. It is built for machine learning systems that investigate legal violations, specifically class action complaints. The main goal is to identify individuals who have been harmed by the conduct described in a case and to help them pursue their compensation claims.
Each row in the dataset contains three key elements:
- **Premise**: This is a concise summary of an actual class action case, carefully summarized to highlight the core legal issue at hand.
- **Hypothesis**: An artificially generated text resembling a complaint or commentary as one might find on social media platforms like Reddit, Twitter, or various blog posts. This text is designed to reflect individual expressions or reports related to the summarized case.
- **Label**: The relationship between the premise and the hypothesis.
## Structure
The repository is structured to facilitate ease of access and utility:
- `LegalLensNLI.csv`: The primary dataset file that includes all the legal domain data.
- `mnli-by-legal-act`: This directory further categorizes the data into specific legal domains and contains separate `train`, `test`, and `validation` files for each domain to support machine learning tasks.
## Data Fields
- **premise**: (str) The summarized background information or context extracted from legal documents, providing the setting or facts upon which the legal reasoning is based.
- **hypothesis**: (str) A statement derived from the premise that represents a possible scenario or assertion that is to be evaluated for its truthfulness within the context of the given premise.
- **legal_act**: (str) The specific legal act or statute that is relevant to the premise and hypothesis, indicating the area of law in question.
- **label**: (int) The classification label assigned to the relationship between the premise and the hypothesis, which typically indicates whether the hypothesis is entailed, contradicted, or neutral based on the premise within the legal context.
## Curation Rationale
The dataset was curated by Darrow.ai (2023).
## Data Instances
Here is how you can load the dataset:
```python
from datasets import load_dataset
dataset = load_dataset("darrow-ai/LegalLensNLI")
```
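The per-domain organization described for the `mnli-by-legal-act` directory can be mimicked with a plain grouping step. The rows below are invented placeholders (not real dataset entries) that only illustrate the four documented columns (`premise`, `hypothesis`, `legal_act`, `label`):

```python
# Hypothetical rows mirroring the four columns documented under
# "Data Fields"; the text is illustrative, not from the dataset.
rows = [
    {"premise": "Summary of a class action over alleged data misuse.",
     "hypothesis": "My personal data was shared without my consent.",
     "legal_act": "Privacy",
     "label": 0},
    {"premise": "Summary of a class action over unpaid overtime.",
     "hypothesis": "The weather was lovely this weekend.",
     "legal_act": "Wage",
     "label": 1},
]

# Group rows by legal_act, the same per-domain partitioning the
# mnli-by-legal-act directory applies before splitting into
# train/test/validation files.
by_act = {}
for row in rows:
    by_act.setdefault(row["legal_act"], []).append(row)

print(sorted(by_act))  # ['Privacy', 'Wage']
```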
### Citation Information
*TBD
*LegalLens: Leveraging LLMs for Legal Violation Identification in Unstructured Text*
*Proceedings of the 2024 European Chapter of the Association for Computational Linguistics. Malta. 2024*
```
@InProceedings TBD
``` |
cos_e | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|commonsense_qa
task_categories:
- question-answering
task_ids:
- open-domain-qa
paperswithcode_id: cos-e
pretty_name: Commonsense Explanations
dataset_info:
- config_name: v1.0
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype: string
- name: abstractive_explanation
dtype: string
- name: extractive_explanation
dtype: string
splits:
- name: train
num_bytes: 2067971
num_examples: 7610
- name: validation
num_bytes: 260669
num_examples: 950
download_size: 1588340
dataset_size: 2328640
- config_name: v1.11
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype: string
- name: abstractive_explanation
dtype: string
- name: extractive_explanation
dtype: string
splits:
- name: train
num_bytes: 2702777
num_examples: 9741
- name: validation
num_bytes: 329897
num_examples: 1221
download_size: 1947552
dataset_size: 3032674
configs:
- config_name: v1.0
data_files:
- split: train
path: v1.0/train-*
- split: validation
path: v1.0/validation-*
- config_name: v1.11
data_files:
- split: train
path: v1.11/train-*
- split: validation
path: v1.11/validation-*
---
# Dataset Card for "cos_e"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** https://github.com/salesforce/cos-e
- **Paper:** [Explain Yourself! Leveraging Language Models for Commonsense Reasoning](https://arxiv.org/abs/1906.02361)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 10.83 MB
- **Size of the generated dataset:** 5.39 MB
- **Total amount of disk used:** 16.22 MB
### Dataset Summary
Common Sense Explanations (CoS-E) allows for training language models to
automatically generate explanations that can be used during training and
inference in a novel Commonsense Auto-Generated Explanation (CAGE) framework.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### v1.0
- **Size of downloaded dataset files:** 4.30 MB
- **Size of the generated dataset:** 2.34 MB
- **Total amount of disk used:** 6.64 MB
An example of 'train' looks as follows.
```
{
"abstractive_explanation": "this is open-ended",
"answer": "b",
"choices": ["a", "b", "c"],
"extractive_explanation": "this is selected train",
"id": "42",
"question": "question goes here."
}
```
#### v1.11
- **Size of downloaded dataset files:** 6.53 MB
- **Size of the generated dataset:** 3.05 MB
- **Total amount of disk used:** 9.58 MB
An example of 'train' looks as follows.
```
{
"abstractive_explanation": "this is open-ended",
"answer": "b",
"choices": ["a", "b", "c"],
"extractive_explanation": "this is selected train",
"id": "42",
"question": "question goes here."
}
```
### Data Fields
The data fields are the same among all splits.
#### v1.0
- `id`: a `string` feature.
- `question`: a `string` feature.
- `choices`: a `list` of `string` features.
- `answer`: a `string` feature.
- `abstractive_explanation`: a `string` feature.
- `extractive_explanation`: a `string` feature.
#### v1.11
- `id`: a `string` feature.
- `question`: a `string` feature.
- `choices`: a `list` of `string` features.
- `answer`: a `string` feature.
- `abstractive_explanation`: a `string` feature.
- `extractive_explanation`: a `string` feature.
### Data Splits
|name |train|validation|
|-----|----:|---------:|
|v1.0 | 7610| 950|
|v1.11| 9741| 1221|
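Both configs expose the same six fields, so the example instance shown under "Data Instances" doubles as a quick schema sanity check (values copied verbatim from that section):

```python
# Replaying the 'train' example from the "Data Instances" section;
# the v1.0 and v1.11 configs share this field layout.
example = {
    "abstractive_explanation": "this is open-ended",
    "answer": "b",
    "choices": ["a", "b", "c"],
    "extractive_explanation": "this is selected train",
    "id": "42",
    "question": "question goes here.",
}

expected_fields = {
    "id", "question", "choices", "answer",
    "abstractive_explanation", "extractive_explanation",
}
assert set(example) == expected_fields
# The answer is drawn from the listed choices.
assert example["answer"] in example["choices"]
print("schema ok")
```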
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
Unknown.
### Citation Information
```
@inproceedings{rajani2019explain,
title = "Explain Yourself! Leveraging Language models for Commonsense Reasoning",
author = "Rajani, Nazneen Fatema and
McCann, Bryan and
Xiong, Caiming and
Socher, Richard",
year="2019",
booktitle = "Proceedings of the 2019 Conference of the Association for Computational Linguistics (ACL2019)",
url ="https://arxiv.org/abs/1906.02361"
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun), [@thomwolf](https://github.com/thomwolf), [@mariamabarham](https://github.com/mariamabarham), [@patrickvonplaten](https://github.com/patrickvonplaten), [@albertvillanova](https://github.com/albertvillanova), [@lhoestq](https://github.com/lhoestq) for adding this dataset. |
zolak/twitter_dataset_79_1713216160 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 1417485
num_examples: 3434
download_size: 712161
dataset_size: 1417485
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ahazeemi/iwslt14-en-fr | ---
dataset_info:
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- fr
- name: en
dtype: string
- name: fr
dtype: string
splits:
- name: train
num_bytes: 79526838
num_examples: 179435
- name: validation
num_bytes: 441724
num_examples: 903
- name: test
num_bytes: 1551384
num_examples: 3666
download_size: 48209615
dataset_size: 81519946
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
stoddur/medication_chat_commands_with_med_name | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 337663524.0
num_examples: 220407
download_size: 12130894
dataset_size: 337663524.0
---
# Dataset Card for "medication_chat_commands_with_med_name"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wtcherr/unsplash_10k_blur_rand_KS | ---
dataset_info:
features:
- name: image
dtype: image
- name: guide
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 3261835252.0
num_examples: 10000
download_size: 3261738834
dataset_size: 3261835252.0
---
# Dataset Card for "unsplash_10k_blur_rand_KS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yyh0901__lloma_step400 | ---
pretty_name: Evaluation run of yyh0901/lloma_step400
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yyh0901/lloma_step400](https://huggingface.co/yyh0901/lloma_step400) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yyh0901__lloma_step400\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-06T13:16:47.684336](https://huggingface.co/datasets/open-llm-leaderboard/details_yyh0901__lloma_step400/blob/main/results_2024-04-06T13-16-47.684336.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3102908056939462,\n\
\ \"acc_stderr\": 0.03251713048512004,\n \"acc_norm\": 0.31260404239670475,\n\
\ \"acc_norm_stderr\": 0.0333330841188841,\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456421,\n \"mc2\": 0.3858138994902272,\n\
\ \"mc2_stderr\": 0.014262726110006094\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.36689419795221845,\n \"acc_stderr\": 0.014084133118104289,\n\
\ \"acc_norm\": 0.3984641638225256,\n \"acc_norm_stderr\": 0.014306946052735565\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4393547102170882,\n\
\ \"acc_stderr\": 0.00495294207299927,\n \"acc_norm\": 0.5946026687910775,\n\
\ \"acc_norm_stderr\": 0.004899653704032829\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4,\n \"acc_stderr\"\
: 0.04232073695151589,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\"\
: 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \
\ \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.03279000406310052,\n\
\ \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.03279000406310052\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206824,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206824\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.32075471698113206,\n \"acc_stderr\": 0.028727502957880263,\n\
\ \"acc_norm\": 0.32075471698113206,\n \"acc_norm_stderr\": 0.028727502957880263\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.032424147574830996,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.032424147574830996\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.031068985963122145,\n\
\ \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.031068985963122145\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.03505859682597264,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.03505859682597264\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746346,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746346\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924315,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924315\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3193548387096774,\n \"acc_stderr\": 0.026522709674667765,\n \"\
acc_norm\": 0.3193548387096774,\n \"acc_norm_stderr\": 0.026522709674667765\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.28078817733990147,\n \"acc_stderr\": 0.0316185633535861,\n \"\
acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.0316185633535861\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3282828282828283,\n \"acc_stderr\": 0.03345678422756776,\n \"\
acc_norm\": 0.3282828282828283,\n \"acc_norm_stderr\": 0.03345678422756776\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.33678756476683935,\n \"acc_stderr\": 0.034107802518361846,\n\
\ \"acc_norm\": 0.33678756476683935,\n \"acc_norm_stderr\": 0.034107802518361846\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3076923076923077,\n \"acc_stderr\": 0.02340092891831049,\n \
\ \"acc_norm\": 0.3076923076923077,\n \"acc_norm_stderr\": 0.02340092891831049\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31932773109243695,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.31932773109243695,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"\
acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.28440366972477066,\n \"acc_stderr\": 0.019342036587702588,\n \"\
acc_norm\": 0.28440366972477066,\n \"acc_norm_stderr\": 0.019342036587702588\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28431372549019607,\n \"acc_stderr\": 0.031660096793998116,\n \"\
acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.031660096793998116\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.350210970464135,\n \"acc_stderr\": 0.031052391937584353,\n \
\ \"acc_norm\": 0.350210970464135,\n \"acc_norm_stderr\": 0.031052391937584353\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4080717488789238,\n\
\ \"acc_stderr\": 0.03298574607842821,\n \"acc_norm\": 0.4080717488789238,\n\
\ \"acc_norm_stderr\": 0.03298574607842821\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.31297709923664124,\n \"acc_stderr\": 0.04066962905677697,\n\
\ \"acc_norm\": 0.31297709923664124,\n \"acc_norm_stderr\": 0.04066962905677697\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.32231404958677684,\n \"acc_stderr\": 0.042664163633521685,\n \"\
acc_norm\": 0.32231404958677684,\n \"acc_norm_stderr\": 0.042664163633521685\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.20535714285714285,\n\
\ \"acc_stderr\": 0.03834241021419073,\n \"acc_norm\": 0.20535714285714285,\n\
\ \"acc_norm_stderr\": 0.03834241021419073\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258975,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258975\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.34615384615384615,\n\
\ \"acc_stderr\": 0.031166957367235897,\n \"acc_norm\": 0.34615384615384615,\n\
\ \"acc_norm_stderr\": 0.031166957367235897\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4086845466155811,\n\
\ \"acc_stderr\": 0.017579250148153393,\n \"acc_norm\": 0.4086845466155811,\n\
\ \"acc_norm_stderr\": 0.017579250148153393\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.28034682080924855,\n \"acc_stderr\": 0.024182427496577612,\n\
\ \"acc_norm\": 0.28034682080924855,\n \"acc_norm_stderr\": 0.024182427496577612\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.027184498909941613,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.027184498909941613\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3536977491961415,\n\
\ \"acc_stderr\": 0.027155208103200868,\n \"acc_norm\": 0.3536977491961415,\n\
\ \"acc_norm_stderr\": 0.027155208103200868\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.025407197798890165,\n\
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.025407197798890165\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.20212765957446807,\n \"acc_stderr\": 0.023956668237850233,\n \
\ \"acc_norm\": 0.20212765957446807,\n \"acc_norm_stderr\": 0.023956668237850233\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26597131681877445,\n\
\ \"acc_stderr\": 0.01128503316555127,\n \"acc_norm\": 0.26597131681877445,\n\
\ \"acc_norm_stderr\": 0.01128503316555127\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3713235294117647,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.3713235294117647,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27450980392156865,\n \"acc_stderr\": 0.018054027458815194,\n \
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.018054027458815194\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.37551020408163266,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.37551020408163266,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2935323383084577,\n\
\ \"acc_stderr\": 0.03220024104534205,\n \"acc_norm\": 0.2935323383084577,\n\
\ \"acc_norm_stderr\": 0.03220024104534205\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.03571609230053481,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.03571609230053481\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.038200425866029654,\n\
\ \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.038200425866029654\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456421,\n \"mc2\": 0.3858138994902272,\n\
\ \"mc2_stderr\": 0.014262726110006094\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6527229676400947,\n \"acc_stderr\": 0.01338090924975124\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01819560272934041,\n \
\ \"acc_stderr\": 0.003681611894073874\n }\n}\n```"
repo_url: https://huggingface.co/yyh0901/lloma_step400
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|arc:challenge|25_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|gsm8k|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hellaswag|10_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T13-16-47.684336.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T13-16-47.684336.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- '**/details_harness|winogrande|5_2024-04-06T13-16-47.684336.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-06T13-16-47.684336.parquet'
- config_name: results
data_files:
- split: 2024_04_06T13_16_47.684336
path:
- results_2024-04-06T13-16-47.684336.parquet
- split: latest
path:
- results_2024-04-06T13-16-47.684336.parquet
---
# Dataset Card for Evaluation run of yyh0901/lloma_step400
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yyh0901/lloma_step400](https://huggingface.co/yyh0901/lloma_step400) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yyh0901__lloma_step400",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-06T13:16:47.684336](https://huggingface.co/datasets/open-llm-leaderboard/details_yyh0901__lloma_step400/blob/main/results_2024-04-06T13-16-47.684336.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3102908056939462,
"acc_stderr": 0.03251713048512004,
"acc_norm": 0.31260404239670475,
"acc_norm_stderr": 0.0333330841188841,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456421,
"mc2": 0.3858138994902272,
"mc2_stderr": 0.014262726110006094
},
"harness|arc:challenge|25": {
"acc": 0.36689419795221845,
"acc_stderr": 0.014084133118104289,
"acc_norm": 0.3984641638225256,
"acc_norm_stderr": 0.014306946052735565
},
"harness|hellaswag|10": {
"acc": 0.4393547102170882,
"acc_stderr": 0.00495294207299927,
"acc_norm": 0.5946026687910775,
"acc_norm_stderr": 0.004899653704032829
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.03279000406310052,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.03279000406310052
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206824,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206824
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.32075471698113206,
"acc_stderr": 0.028727502957880263,
"acc_norm": 0.32075471698113206,
"acc_norm_stderr": 0.028727502957880263
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.032424147574830996,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.032424147574830996
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3446808510638298,
"acc_stderr": 0.031068985963122145,
"acc_norm": 0.3446808510638298,
"acc_norm_stderr": 0.031068985963122145
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03505859682597264,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03505859682597264
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746346,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746346
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924315,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924315
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3193548387096774,
"acc_stderr": 0.026522709674667765,
"acc_norm": 0.3193548387096774,
"acc_norm_stderr": 0.026522709674667765
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.0316185633535861,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.0316185633535861
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3282828282828283,
"acc_stderr": 0.03345678422756776,
"acc_norm": 0.3282828282828283,
"acc_norm_stderr": 0.03345678422756776
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.33678756476683935,
"acc_stderr": 0.034107802518361846,
"acc_norm": 0.33678756476683935,
"acc_norm_stderr": 0.034107802518361846
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.02340092891831049,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.02340092891831049
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31932773109243695,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.31932773109243695,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696525,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28440366972477066,
"acc_stderr": 0.019342036587702588,
"acc_norm": 0.28440366972477066,
"acc_norm_stderr": 0.019342036587702588
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.031660096793998116,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.031660096793998116
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.350210970464135,
"acc_stderr": 0.031052391937584353,
"acc_norm": 0.350210970464135,
"acc_norm_stderr": 0.031052391937584353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4080717488789238,
"acc_stderr": 0.03298574607842821,
"acc_norm": 0.4080717488789238,
"acc_norm_stderr": 0.03298574607842821
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.31297709923664124,
"acc_stderr": 0.04066962905677697,
"acc_norm": 0.31297709923664124,
"acc_norm_stderr": 0.04066962905677697
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.32231404958677684,
"acc_stderr": 0.042664163633521685,
"acc_norm": 0.32231404958677684,
"acc_norm_stderr": 0.042664163633521685
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.20535714285714285,
"acc_stderr": 0.03834241021419073,
"acc_norm": 0.20535714285714285,
"acc_norm_stderr": 0.03834241021419073
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258975,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258975
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.34615384615384615,
"acc_stderr": 0.031166957367235897,
"acc_norm": 0.34615384615384615,
"acc_norm_stderr": 0.031166957367235897
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4086845466155811,
"acc_stderr": 0.017579250148153393,
"acc_norm": 0.4086845466155811,
"acc_norm_stderr": 0.017579250148153393
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28034682080924855,
"acc_stderr": 0.024182427496577612,
"acc_norm": 0.28034682080924855,
"acc_norm_stderr": 0.024182427496577612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.027184498909941613,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.027184498909941613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3536977491961415,
"acc_stderr": 0.027155208103200868,
"acc_norm": 0.3536977491961415,
"acc_norm_stderr": 0.027155208103200868
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.025407197798890165,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.025407197798890165
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.20212765957446807,
"acc_stderr": 0.023956668237850233,
"acc_norm": 0.20212765957446807,
"acc_norm_stderr": 0.023956668237850233
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26597131681877445,
"acc_stderr": 0.01128503316555127,
"acc_norm": 0.26597131681877445,
"acc_norm_stderr": 0.01128503316555127
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3713235294117647,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.3713235294117647,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.018054027458815194,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.018054027458815194
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37551020408163266,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.37551020408163266,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2935323383084577,
"acc_stderr": 0.03220024104534205,
"acc_norm": 0.2935323383084577,
"acc_norm_stderr": 0.03220024104534205
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.03571609230053481,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.03571609230053481
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.038200425866029654,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.038200425866029654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456421,
"mc2": 0.3858138994902272,
"mc2_stderr": 0.014262726110006094
},
"harness|winogrande|5": {
"acc": 0.6527229676400947,
"acc_stderr": 0.01338090924975124
},
"harness|gsm8k|5": {
"acc": 0.01819560272934041,
"acc_stderr": 0.003681611894073874
}
}
```
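For a quick look at where the model is strongest, the per-task accuracies in the JSON above can be inspected with plain Python, without any leaderboard tooling. The sketch below ranks a hand-copied subset of the Hendrycks tasks by `acc`; the values are taken verbatim from the results shown here:

```python
# A hand-copied subset of the per-task accuracies from the results JSON above.
task_acc = {
    "hendrycksTest-high_school_statistics": 0.4675925925925926,
    "hendrycksTest-world_religions": 0.45614035087719296,
    "hendrycksTest-miscellaneous": 0.4086845466155811,
    "hendrycksTest-human_aging": 0.4080717488789238,
    "hendrycksTest-econometrics": 0.16666666666666666,
}

# Rank the tasks from strongest to weakest accuracy.
ranked = sorted(task_acc.items(), key=lambda kv: kv[1], reverse=True)
for name, acc in ranked:
    print(f"{name}: {acc:.3f}")
```

The same pattern works on the full `results` config once it is loaded as a dataset.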
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_OptimalScale__robin-13b-v2-delta | ---
pretty_name: Evaluation run of OptimalScale/robin-13b-v2-delta
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OptimalScale/robin-13b-v2-delta](https://huggingface.co/OptimalScale/robin-13b-v2-delta)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OptimalScale__robin-13b-v2-delta\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-04T18:08:52.244101](https://huggingface.co/datasets/open-llm-leaderboard/details_OptimalScale__robin-13b-v2-delta/blob/main/results_2023-08-04T18%3A08%3A52.244101.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48671411429389705,\n\
\ \"acc_stderr\": 0.034851524265514446,\n \"acc_norm\": 0.49073578692938213,\n\
\ \"acc_norm_stderr\": 0.03483423136146648,\n \"mc1\": 0.3378212974296206,\n\
\ \"mc1_stderr\": 0.016557167322516882,\n \"mc2\": 0.5054136576088012,\n\
\ \"mc2_stderr\": 0.014772161409527505\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.537542662116041,\n \"acc_stderr\": 0.014570144495075581,\n\
\ \"acc_norm\": 0.5656996587030717,\n \"acc_norm_stderr\": 0.014484703048857364\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5944035052778331,\n\
\ \"acc_stderr\": 0.004900036261309047,\n \"acc_norm\": 0.8035251941844254,\n\
\ \"acc_norm_stderr\": 0.003965196368697847\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.45660377358490567,\n \"acc_stderr\": 0.030656748696739435,\n\
\ \"acc_norm\": 0.45660377358490567,\n \"acc_norm_stderr\": 0.030656748696739435\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.037786210790920545,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.037786210790920545\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4068965517241379,\n \"acc_stderr\": 0.04093793981266237,\n\
\ \"acc_norm\": 0.4068965517241379,\n \"acc_norm_stderr\": 0.04093793981266237\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113946,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113946\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.49032258064516127,\n\
\ \"acc_stderr\": 0.028438677998909558,\n \"acc_norm\": 0.49032258064516127,\n\
\ \"acc_norm_stderr\": 0.028438677998909558\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.032550867699701024,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.032550867699701024\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.037694303145125674,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.037694303145125674\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5606060606060606,\n \"acc_stderr\": 0.03536085947529479,\n \"\
acc_norm\": 0.5606060606060606,\n \"acc_norm_stderr\": 0.03536085947529479\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6735751295336787,\n \"acc_stderr\": 0.033840286211432945,\n\
\ \"acc_norm\": 0.6735751295336787,\n \"acc_norm_stderr\": 0.033840286211432945\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.441025641025641,\n \"acc_stderr\": 0.025174048384000756,\n \
\ \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.025174048384000756\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22962962962962963,\n \"acc_stderr\": 0.02564410863926762,\n \
\ \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.02564410863926762\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4369747899159664,\n \"acc_stderr\": 0.03221943636566196,\n \
\ \"acc_norm\": 0.4369747899159664,\n \"acc_norm_stderr\": 0.03221943636566196\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6458715596330276,\n \"acc_stderr\": 0.02050472901382912,\n \"\
acc_norm\": 0.6458715596330276,\n \"acc_norm_stderr\": 0.02050472901382912\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.03085199299325701,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.03085199299325701\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6323529411764706,\n \"acc_stderr\": 0.03384132045674118,\n \"\
acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.03384132045674118\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955924,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955924\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n\
\ \"acc_stderr\": 0.0332319730294294,\n \"acc_norm\": 0.5695067264573991,\n\
\ \"acc_norm_stderr\": 0.0332319730294294\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5828220858895705,\n \"acc_stderr\": 0.038741028598180814,\n\
\ \"acc_norm\": 0.5828220858895705,\n \"acc_norm_stderr\": 0.038741028598180814\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n\
\ \"acc_stderr\": 0.028286324075564397,\n \"acc_norm\": 0.7521367521367521,\n\
\ \"acc_norm_stderr\": 0.028286324075564397\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6845466155810983,\n\
\ \"acc_stderr\": 0.016617501738763394,\n \"acc_norm\": 0.6845466155810983,\n\
\ \"acc_norm_stderr\": 0.016617501738763394\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4884393063583815,\n \"acc_stderr\": 0.02691189868637792,\n\
\ \"acc_norm\": 0.4884393063583815,\n \"acc_norm_stderr\": 0.02691189868637792\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.014465893829859926,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.014465893829859926\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02861462475280544,\n\
\ \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02861462475280544\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5305466237942122,\n\
\ \"acc_stderr\": 0.02834504586484061,\n \"acc_norm\": 0.5305466237942122,\n\
\ \"acc_norm_stderr\": 0.02834504586484061\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.027701228468542595,\n\
\ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.027701228468542595\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347247,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347247\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4172099087353325,\n\
\ \"acc_stderr\": 0.012593959992906424,\n \"acc_norm\": 0.4172099087353325,\n\
\ \"acc_norm_stderr\": 0.012593959992906424\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.0303720158854282,\n\
\ \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.0303720158854282\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4869281045751634,\n \"acc_stderr\": 0.020220920829626923,\n \
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.020220920829626923\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5551020408163265,\n \"acc_stderr\": 0.031814251181977865,\n\
\ \"acc_norm\": 0.5551020408163265,\n \"acc_norm_stderr\": 0.031814251181977865\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n\
\ \"acc_stderr\": 0.03368787466115459,\n \"acc_norm\": 0.6517412935323383,\n\
\ \"acc_norm_stderr\": 0.03368787466115459\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.038743715565879536,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.038743715565879536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.035087719298245626,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.035087719298245626\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3378212974296206,\n\
\ \"mc1_stderr\": 0.016557167322516882,\n \"mc2\": 0.5054136576088012,\n\
\ \"mc2_stderr\": 0.014772161409527505\n }\n}\n```"
repo_url: https://huggingface.co/OptimalScale/robin-13b-v2-delta
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|arc:challenge|25_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hellaswag|10_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-04T18:08:52.244101.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-04T18:08:52.244101.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-04T18:08:52.244101.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-04T18:08:52.244101.parquet'
- config_name: results
data_files:
- split: 2023_08_04T18_08_52.244101
path:
- results_2023-08-04T18:08:52.244101.parquet
- split: latest
path:
- results_2023-08-04T18:08:52.244101.parquet
---
# Dataset Card for Evaluation run of OptimalScale/robin-13b-v2-delta
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OptimalScale/robin-13b-v2-delta
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OptimalScale/robin-13b-v2-delta](https://huggingface.co/OptimalScale/robin-13b-v2-delta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OptimalScale__robin-13b-v2-delta",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-04T18:08:52.244101](https://huggingface.co/datasets/open-llm-leaderboard/details_OptimalScale__robin-13b-v2-delta/blob/main/results_2023-08-04T18%3A08%3A52.244101.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48671411429389705,
"acc_stderr": 0.034851524265514446,
"acc_norm": 0.49073578692938213,
"acc_norm_stderr": 0.03483423136146648,
"mc1": 0.3378212974296206,
"mc1_stderr": 0.016557167322516882,
"mc2": 0.5054136576088012,
"mc2_stderr": 0.014772161409527505
},
"harness|arc:challenge|25": {
"acc": 0.537542662116041,
"acc_stderr": 0.014570144495075581,
"acc_norm": 0.5656996587030717,
"acc_norm_stderr": 0.014484703048857364
},
"harness|hellaswag|10": {
"acc": 0.5944035052778331,
"acc_stderr": 0.004900036261309047,
"acc_norm": 0.8035251941844254,
"acc_norm_stderr": 0.003965196368697847
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.45660377358490567,
"acc_stderr": 0.030656748696739435,
"acc_norm": 0.45660377358490567,
"acc_norm_stderr": 0.030656748696739435
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.037786210790920545,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.037786210790920545
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4068965517241379,
"acc_stderr": 0.04093793981266237,
"acc_norm": 0.4068965517241379,
"acc_norm_stderr": 0.04093793981266237
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113946,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113946
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.49032258064516127,
"acc_stderr": 0.028438677998909558,
"acc_norm": 0.49032258064516127,
"acc_norm_stderr": 0.028438677998909558
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.037694303145125674,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.037694303145125674
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5606060606060606,
"acc_stderr": 0.03536085947529479,
"acc_norm": 0.5606060606060606,
"acc_norm_stderr": 0.03536085947529479
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6735751295336787,
"acc_stderr": 0.033840286211432945,
"acc_norm": 0.6735751295336787,
"acc_norm_stderr": 0.033840286211432945
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.441025641025641,
"acc_stderr": 0.025174048384000756,
"acc_norm": 0.441025641025641,
"acc_norm_stderr": 0.025174048384000756
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.02564410863926762,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.02564410863926762
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4369747899159664,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.4369747899159664,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.035118075718047245,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.035118075718047245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6458715596330276,
"acc_stderr": 0.02050472901382912,
"acc_norm": 0.6458715596330276,
"acc_norm_stderr": 0.02050472901382912
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.03085199299325701,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.03085199299325701
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.03384132045674118,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.03384132045674118
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.029312814153955924,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.029312814153955924
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5695067264573991,
"acc_stderr": 0.0332319730294294,
"acc_norm": 0.5695067264573991,
"acc_norm_stderr": 0.0332319730294294
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5828220858895705,
"acc_stderr": 0.038741028598180814,
"acc_norm": 0.5828220858895705,
"acc_norm_stderr": 0.038741028598180814
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7521367521367521,
"acc_stderr": 0.028286324075564397,
"acc_norm": 0.7521367521367521,
"acc_norm_stderr": 0.028286324075564397
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6845466155810983,
"acc_stderr": 0.016617501738763394,
"acc_norm": 0.6845466155810983,
"acc_norm_stderr": 0.016617501738763394
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4884393063583815,
"acc_stderr": 0.02691189868637792,
"acc_norm": 0.4884393063583815,
"acc_norm_stderr": 0.02691189868637792
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859926,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859926
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.02861462475280544,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.02861462475280544
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5305466237942122,
"acc_stderr": 0.02834504586484061,
"acc_norm": 0.5305466237942122,
"acc_norm_stderr": 0.02834504586484061
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.027701228468542595,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.027701228468542595
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347247,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347247
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4172099087353325,
"acc_stderr": 0.012593959992906424,
"acc_norm": 0.4172099087353325,
"acc_norm_stderr": 0.012593959992906424
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.0303720158854282,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.0303720158854282
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.020220920829626923,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.020220920829626923
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5551020408163265,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.5551020408163265,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.03368787466115459,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.03368787466115459
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.035087719298245626,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.035087719298245626
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3378212974296206,
"mc1_stderr": 0.016557167322516882,
"mc2": 0.5054136576088012,
"mc2_stderr": 0.014772161409527505
}
}
```
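As an illustrative sketch (not part of the evaluation harness or this dataset's tooling), the per-task scores in a payload like the one above can be aggregated with plain Python; the helper name and the trimmed-down sample dict below are assumptions for the example:

```python
def mean_metric(results: dict, metric: str = "acc") -> float:
    """Average a metric over all per-task entries, skipping the "all" summary."""
    values = [
        scores[metric]
        for task, scores in results.items()
        if task != "all" and metric in scores
    ]
    return sum(values) / len(values)

# Trimmed-down sample in the same shape as the JSON above; tasks that do
# not report the requested metric (e.g. truthfulqa only reports mc1/mc2)
# are simply skipped.
sample = {
    "all": {"acc": 0.48671411429389705},
    "harness|arc:challenge|25": {"acc": 0.537542662116041},
    "harness|hellaswag|10": {"acc": 0.5944035052778331},
    "harness|truthfulqa:mc|0": {"mc1": 0.3378212974296206},
}
print(mean_metric(sample))
```

Note that the leaderboard's own "all" entry is computed over every benchmark, so a partial average like this will generally differ from it.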
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
magic1992/comfyUifile
---
license: apache-2.0
---
awghuku/thai_ser
---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': Anger
'1': Frustration
'2': Happiness
'3': Neutral
'4': Sadness
splits:
- name: train
num_bytes: 2977334910.978
num_examples: 14231
download_size: 2883049328
dataset_size: 2977334910.978
---
# Dataset Card for "thai_ser"
[ORIGINAL DATASET HERE](https://github.com/vistec-AI/dataset-releases/releases/tag/v1)
The AI Research Institute of Thailand (AIResearch), a collaboration between the Vidyasirimedhi Institute of Science and Technology (VISTEC) and the Digital Economy Promotion Agency (depa), together with the Department of Computer Engineering (Faculty of Engineering) and the Department of Dramatic Arts (Faculty of Arts) of Chulalongkorn University, publishes THAI SER, an open Thai speech emotion recognition dataset sponsored by Advanced Info Services Public Company Limited (AIS).
This dataset covers 5 main emotions assigned to the actors: Neutral, Anger, Happiness, Sadness, and Frustration. The recordings total 41 hours and 36 minutes (27,854 utterances) and were performed by 200 professional actors (112 female, 88 male), directed by students, alumni, and professors from the Faculty of Arts, Chulalongkorn University.
THAI SER contains 100 recordings separated into two main categories: Studio and Zoom. The studio recordings come from two environments: Studio A, a controlled studio room with soundproof walls, and Studio B, a normal room without soundproofing or noise control. The recording environments can be summarized as follows:
```
StudioA (noise controlled, soundproof wall)
└─ studio001
└─ studio002
...
└─ studio018
StudioB (Normal room without soundproof wall)
└─ studio019
└─ studio020
...
└─ studio080
Zoom (Recorded online via Zoom and Zencastr)
└─ zoom001
└─ zoom002
...
└─ zoom020
```
Each recording is separated into two sessions: Script Session and Improvisation Session.
To map each utterance to an emotion, we use the majority vote of answers from 3 to 8 annotators, collected through crowdsourcing (wang.in.th).
Script session
In the script session, the actor was assigned three sentences:
```
sentence 1: พรุ่งนี้มันวันหยุดราชการนะรู้รึยัง หยุดยาวด้วย
(Do you know tomorrow is a public holiday and it's the long one.)
sentence 2: อ่านหนังสือพิมพ์วันนี้รึยัง รู้ไหมเรื่องนั้นกลายเป็นข่าวใหญ่ไปแล้ว
(Have you read today's newspaper, that story was the topliner.)
sentence 3: ก่อนหน้านี้ก็ยังเห็นทำตัวปกติดี ใครจะไปรู้หล่ะ ว่าเค้าคิดแบบนั้น
(He/She was acting normal recently, who would thought that he/she would think like that.)
```
The actor was asked to speak each sentence two times for each emotion, at two emotional intensity levels (normal, strong), plus an additional neutral expression.
Improvisation session
For the improvisation session, two actors were asked to improvise according to a provided emotion and scenario.
| Scenario | Actor A | Actor B |
|----------|---------|---------|
| 1 | (Neutral) A hotel receptionist trying to explain and serve the customer | (Angry) An angry customer dissatisfied with the hotel services |
| 2 | (Happy) A person excitedly talking with B about his/her marriage plan | (Happy) A person happily talking with A and helping him/her plan the ceremony |
| 3 | (Sad) A patient feeling depressed | (Neutral) A doctor attempting to talk with A neutrally |
| 4 | (Angry) A furious boss talking with the employee | (Frustrated) A frustrated person attempting to argue with his/her boss |
| 5 | (Frustrated) A person frustratedly talking about another person's action | (Sad) A person feeling guilty and sad about his/her action |
| 6 | (Happy) Happy hotel staff | (Happy) A happy customer |
| 7 | (Sad) A sad person feeling insecure about the upcoming marriage | (Frustrated) A person frustrated about the other person's insecurity |
| 8 | (Frustrated) A frustrated patient | (Neutral) A doctor talking with the patient |
| 9 | (Neutral) A worker assigned to tell a co-worker about the company's bad situation | (Sad) An employee feeling sad after listening |
| 10 | (Angry) A person raging about another person's behavior | (Angry) A person who feels blamed by the other person |
| 11 | (Frustrated) A director unsatisfied with a co-worker | (Frustrated) A frustrated person trying their best on the job |
| 12 | (Happy) A person who gets a new job or promotion | (Sad) A person desperate in his/her job |
| 13 | (Neutral) A patient inquiring about information | (Happy) A happy doctor giving his/her patient more information |
| 14 | (Angry) A person upset with his/her work | (Neutral) A calm friend listening to the other person's problem |
| 15 | (Sad) A person sadly telling another person about a relationship | (Angry) A person who feels angry after listening to the other person's bad relationship |

File naming convention
Each file has a unique filename, provided in .flac format with a sample rate of about 44.1 kHz. The filename consists of a 5- to 6-part identifier (e.g., s002_clip_actor003_impro1_1.flac, s002_clip_actor003_script1_1_1a.flac). These identifiers define the stimulus characteristics:
File Directory Management
```
studio (e.g., studio1-10)
└─ <studio-num> (studio1, studio2, ...)
   └─ <mic-type> (con, clip, middle)
      └─ <audio-file> (.flac)
zoom (e.g., zoom1-10)
└─ <zoom-num> (zoom1, zoom2, ...)
   └─ <mic-type> (mic)
      └─ <audio-file> (.flac)
```
Filename identifiers
```
Recording ID (s = studio recording, z = zoom recording)
  Number of the recording (e.g., s001, z001)
Microphone type (clip, con, middle, mic)
  Zoom recording session:
    mic    = the actor's microphone of choice
  Studio recording session:
    con    = condenser microphone (cardioid polar pattern), placed 0.5 m from the actor
    clip   = lavalier microphone (omni-directional pattern), attached to the actor's shirt collar
    middle = condenser microphone (figure-8 polar pattern), placed between the actors
Actor ID (actor001 to actor200; odd-numbered actors are Actor A, even-numbered actors are Actor B in the improvisation session)
Session ID (impro = improvisation session, script = script session)
  Script session (e.g., _script1_1_1a)
    Sentence ID (script1-script3)
    Repetition (1 = 1st repetition, 2 = 2nd repetition)
    Emotion (1 = Neutral, 2 = Angry, 3 = Happy, 4 = Sad, 5 = Frustrated)
    Emotional intensity (a = Normal, b = Strong)
  Improvisation session (e.g., _impro1_1)
    Scenario ID (impro1-impro15)
    Utterance number (e.g., _impro1_1, _impro1_2)
```
Filename example: s002_clip_actor003_impro1_1.flac
```
s002     = studio recording number 2
clip     = recorded with the lavalier microphone
actor003 = 3rd actor
impro1   = improvisation session, scenario 1
1        = 1st utterance of the scenario recording
```
Other files
```
emotion_label.json    - a dictionary with, per recording id, the assigned emotion (assigned_emo), the majority emotion (emotion_emo), the annotated emotions from crowdsourcing (annotated), and the majority agreement score (agreement)
actor_demography.json - a dictionary that contains information about the age and sex of the actors
```
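The naming convention above can be decoded mechanically. Below is a rough sketch of a parser for these filenames; the regular expression, field names, and function name are illustrative choices for this card, not part of the official dataset tooling:

```python
import re

# Decodes THAI SER filenames such as s002_clip_actor003_impro1_1.flac
# or s002_clip_actor003_script1_1_1a.flac (see the identifier list above).
PATTERN = re.compile(
    r"(?P<recording>[sz]\d{3})_"        # recording id: s = studio, z = zoom
    r"(?P<mic>clip|con|middle|mic)_"    # microphone type
    r"(?P<actor>actor\d{3})_"           # actor id (actor001-actor200)
    r"(?:impro(?P<scenario>\d+)_(?P<utterance>\d+)"   # improvisation session
    r"|script(?P<sentence>\d)_(?P<repetition>\d)_"    # or script session
    r"(?P<emotion>[1-5])(?P<intensity>[ab]))"
    r"\.flac$"
)

EMOTIONS = {"1": "Neutral", "2": "Angry", "3": "Happy", "4": "Sad", "5": "Frustrated"}

def parse_filename(name: str) -> dict:
    """Return the identifier fields of a THAI SER filename as a dict."""
    match = PATTERN.match(name)
    if match is None:
        raise ValueError(f"not a THAI SER filename: {name}")
    fields = {k: v for k, v in match.groupdict().items() if v is not None}
    if "emotion" in fields:  # only present for script-session files
        fields["emotion"] = EMOTIONS[fields["emotion"]]
    return fields

print(parse_filename("s002_clip_actor003_impro1_1.flac"))
```

For example, a script-session file such as `s002_clip_actor003_script1_1_1a.flac` decodes to sentence 1, repetition 1, Neutral emotion, normal intensity.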
Version
```
Version 1 (26 March 2021): The THAI SER dataset contains 100 recordings (80 studio and 20 Zoom), 41 hours 36 minutes long in total, comprising 27,854 utterances, all of which are labeled.
```
Dataset statistics

| Recording environment | Session | Number of utterances | Duration (hrs) |
|-----------------------|---------------|--------|---------|
| Zoom (20) | Script | 2,398 | 4.0279 |
| Zoom (20) | Improvisation | 3,606 | 5.8860 |
| Studio (80) | Script | 9,582 | 13.6903 |
| Studio (80) | Improvisation | 12,268 | 18.0072 |
| Total (100) | | 27,854 | 41.6114 |
Dataset sponsorship and license
The dataset is sponsored by Advanced Info Services Public Company Limited (AIS).
This work is published under a Creative Commons BY-SA 4.0 license.
Nerfgun3/hurybone_style
---
language:
- en
license: creativeml-openrail-m
thumbnail: "https://huggingface.co/datasets/Nerfgun3/hurybone_style/resolve/main/hurybone_showcase.png"
tags:
- stable-diffusion
- text-to-image
- image-to-image
inference: false
---
# Hurybone Style Embedding / Textual Inversion
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/hurybone_style/resolve/main/hurybone_showcase.png"/>
## Usage
To use this embedding, you have to download the file as well as drop it into the "\stable-diffusion-webui\embeddings" folder.
To use it in a prompt: ```"hurybone_style"```
Personally, I would recommend using my embeddings with a strength of 0.8, like ```"(hurybone_style:0.8)"```
I hope you enjoy the embedding. If you have any questions, you can ask me anything via Discord: "Nerfgun3#7508"
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights on the outputs you generate; you are free to use them, and you are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware that you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license)
zolak/twitter_dataset_50_1713064201
---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2690570
num_examples: 6549
download_size: 1368772
dataset_size: 2690570
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
316usman/thematic4b
---
license: bsd
dataset_info:
features:
- name: text
dtype: string
- name: thematic
dtype: string
- name: sub-thematic
dtype: string
- name: country
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 175160177
num_examples: 232037
download_size: 55079777
dataset_size: 175160177
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
DEplain/DEplain-APA-doc
---
annotations_creators:
- no-annotation
language:
- de
language_creators:
- expert-generated
license:
- other
multilinguality:
- translation
- monolingual
pretty_name: DEplain-APA-doc
size_categories:
- <1K
source_datasets:
- original
tags:
- web-text
- plain language
- easy-to-read language
- document simplification
task_categories:
- text2text-generation
task_ids:
- text-simplification
---
# DEplain-APA-doc: A corpus for German Document Simplification
DEplain-APA-doc is a subcorpus of DEplain [(Stodden et al., 2023)](https://arxiv.org/abs/2305.18939) for document simplification.
The corpus consists of 483 (387/48/48) parallel documents from the Austrian Press Agency (APA), written in German for people with CEFR level B1 (plain language) and for people with CEFR level A2 (plain language). All documents are either published under an open license or the copyright holders gave us permission to share the data.
Human annotators also aligned the 483 documents sentence-wise to build a corpus for sentence simplification.
For the sentence-level version of this corpus, please see [https://huggingface.co/datasets/DEplain/DEplain-APA-sent](https://huggingface.co/datasets/DEplain/DEplain-APA-sent).
The APA (Austrian Press Agency) data is restricted to non-commercial research purposes. To get access to DEplain-APA, please request access via Zenodo (https://zenodo.org/record/7674560).
# Dataset Card for DEplain-APA-doc
### Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
### Dataset Description
- **Repository:** [DEplain-APA zenodo repository](https://zenodo.org/record/7674560)
- **Paper:** ["DEplain: A German Parallel Corpus with Intralingual Translations into Plain Language for Sentence and Document Simplification."](https://arxiv.org/abs/2305.18939)
- **Point of Contact:** [Regina Stodden](regina.stodden@hhu.de)
#### Dataset Summary
DEplain-APA [(Stodden et al., 2023)](https://arxiv.org/abs/2305.18939) is a dataset for the training and evaluation of sentence and document simplification in German. All texts of this dataset are provided by the Austrian Press Agency. The simple-complex sentence pairs are manually aligned.
#### Supported Tasks and Leaderboards
The dataset supports the training and evaluation of `text-simplification` systems. Success in this task is typically measured using the [SARI](https://huggingface.co/metrics/sari) and [FKBLEU](https://huggingface.co/metrics/fkbleu) metrics described in the paper [Optimizing Statistical Machine Translation for Text Simplification](https://www.aclweb.org/anthology/Q16-1029.pdf).
#### Languages
The text in this dataset is in Austrian German (`de-at`).
#### Domains
All texts in this dataset are news data.
## Dataset Structure
#### Data Access
- The dataset is licensed with restricted access for academic purposes only. To download the dataset, please request access on [zenodo](https://zenodo.org/record/7674560).
#### Data Instances
- `document-simplification` configuration: an instance consists of an original document and one reference simplification (in plain-text format).
- `sentence-simplification` configuration: an instance consists of original sentence(s) and one manually aligned reference simplification (including one or more sentences).
#### Data Fields
| data field | data field description |
|-------------------------------------------------|-------------------------------------------------------------------------------------------------------|
| `original` | an original text from the source dataset |
| `simplification` | a simplified text from the source dataset |
| `pair_id` | document pair id |
| `complex_document_id ` (on doc-level) | id of complex document (-1) |
| `simple_document_id ` (on doc-level) | id of simple document (-0) |
| `original_id ` (on sent-level) | id of sentence(s) of the original text |
| `simplification_id ` (on sent-level) | id of sentence(s) of the simplified text |
| `domain ` | text domain of the document pair |
| `corpus ` | subcorpus name |
| `simple_url ` | origin URL of the simplified document |
| `complex_url ` | origin URL of the simplified document |
| `simple_level ` or `language_level_simple ` | required CEFR language level to understand the simplified document |
| `complex_level ` or `language_level_original ` | required CEFR language level to understand the original document |
| `simple_location_html ` | location on hard disk where the HTML file of the simple document is stored |
| `complex_location_html ` | location on hard disk where the HTML file of the original document is stored |
| `simple_location_txt ` | location on hard disk where the content extracted from the HTML file of the simple document is stored |
| `complex_location_txt ` | location on hard disk where the content extracted from the HTML file of the simple document is stored |
| `alignment_location ` | location on hard disk where the alignment is stored |
| `simple_author ` | author (or copyright owner) of the simplified document |
| `complex_author ` | author (or copyright owner) of the original document |
| `simple_title ` | title of the simplified document |
| `complex_title ` | title of the original document |
| `license ` | license of the data |
| `last_access ` or `access_date`                  | date of the data origin or date when the HTML files were downloaded                                     |
| `rater` | id of the rater who annotated the sentence pair |
| `alignment` | type of alignment, e.g., 1:1, 1:n, n:1 or n:m |
#### Data Splits
DEplain-APA is randomly split into a training, development and test set. The training set of the sentence-simplification configuration contains only texts from documents that are part of the training set of the document-simplification configuration, and the same holds for the dev and test sets.
The statistics are given below.
| | Train | Dev | Test | Total |
| ----- | ------ | ------ | ---- | ----- |
| Document Pairs | 387 | 48 | 48 |483 |
| Sentence Pairs | 10660 | 1231 | 1231 | 13122|
Inter-Annotator-Agreement: 0.7497 (moderate)
More information on simplification operations will follow soon.
### Dataset Creation
#### Curation Rationale
DEplain-APA was created to improve the training and evaluation of German document and sentence simplification. The data comes from the same data provider as the APA-LHA corpus. In comparison to APA-LHA (automatically aligned), the sentence pairs of DEplain-APA are all manually aligned. Further, DEplain-APA aligns texts at language level B1 with texts at level A2, which results in mostly mild simplifications.
In addition, DEplain-APA contains parallel documents as well as parallel sentence pairs.
#### Source Data
##### Initial Data Collection and Normalization
The original news texts (in CEFR level B1) were manually simplified by professional translators, i.e., capito – CFS GmbH, and provided to us by the Austrian Press Agency.
All documents date from 2019 to 2021.
Two German native speakers manually aligned the sentence pairs using the text simplification annotation tool TS-ANNO. The data was split into sentences using a German spaCy model.
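The sentence segmentation above was done with a German spaCy model. A rough stdlib-only approximation of that step (splitting on sentence-final punctuation; it will miss abbreviations and other cases the real model handles) looks like this:

```python
import re

def split_sentences(text: str) -> list[str]:
    # Split after ., ! or ? when followed by whitespace and an
    # uppercase letter (including German umlauts) or a digit.
    parts = re.split(r"(?<=[.!?])\s+(?=[A-ZÄÖÜ0-9])", text.strip())
    return [p for p in parts if p]

print(split_sentences("Das ist ein Satz. Das ist noch einer."))
# → ['Das ist ein Satz.', 'Das ist noch einer.']
```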
##### Who are the source language producers?
The original news texts (in CEFR level B1) were manually simplified by professional translators, i.e., capito – CFS GmbH. No other demographic or compensation information is known.
#### Annotations
##### Annotation process
The instructions given to the annotators are available [here](https://github.com/rstodden/TS_annotation_tool/tree/master/annotation_schema).
##### Who are the annotators?
The annotators are two German native speakers who are trained in linguistics. Both were compensated with at least the minimum wage of their country of residence.
They are not part of any target group of text simplification.
#### Personal and Sensitive Information
No sensitive data.
### Considerations for Using the Data
#### Social Impact of Dataset
Many people cannot understand texts due to their complexity. With automatic text simplification methods, such texts can be simplified for them. This new training data can benefit the training of text simplification models.
#### Discussion of Biases
No bias is known.
#### Other Known Limitations
The dataset is provided for research purposes only. Please check the dataset license for additional information.
### Additional Information
#### Dataset Curators
Researchers at the Heinrich-Heine-University Düsseldorf, Germany, developed DEplain-APA. This research is part of the PhD-program `Online Participation` supported by the North Rhine-Westphalian (German) funding scheme `Forschungskolleg`.
#### Licensing Information
The dataset (DEplain-APA) is provided for research purposes only. Please request access using the following form: [https://zenodo.org/record/7674560](https://zenodo.org/record/7674560).
#### Citation Information
If you use part of this work, please cite our paper:
```
@inproceedings{stodden-etal-2023-deplain,
    title = "{DE}plain: A German Parallel Corpus with Intralingual Translations into Plain Language for Sentence and Document Simplification",
author = "Stodden, Regina and
Momen, Omar and
Kallmeyer, Laura",
booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
notes = "preprint: https://arxiv.org/abs/2305.18939",
}
```
This dataset card uses material written by [Juan Diego Rodriguez](https://github.com/juand-r) and [Yacine Jernite](https://github.com/yjernite).
|
open-llm-leaderboard/details_marcchew__test1 | ---
pretty_name: Evaluation run of marcchew/test1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [marcchew/test1](https://huggingface.co/marcchew/test1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_marcchew__test1\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-03T19:35:56.043440](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__test1/blob/main/results_2023-12-03T19-35-56.043440.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"\
acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \
\ \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/marcchew/test1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|arc:challenge|25_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T02_23_20.616276
path:
- '**/details_harness|drop|3_2023-10-13T02-23-20.616276.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-13T02-23-20.616276.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T02_23_20.616276
path:
- '**/details_harness|gsm8k|5_2023-10-13T02-23-20.616276.parquet'
- split: 2023_12_03T19_35_56.043440
path:
- '**/details_harness|gsm8k|5_2023-12-03T19-35-56.043440.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-03T19-35-56.043440.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hellaswag|10_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T02_23_20.616276
path:
- '**/details_harness|winogrande|5_2023-10-13T02-23-20.616276.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-13T02-23-20.616276.parquet'
- config_name: results
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- results_2023-09-01T15:41:12.486637.parquet
- split: 2023_10_13T02_23_20.616276
path:
- results_2023-10-13T02-23-20.616276.parquet
- split: 2023_12_03T19_35_56.043440
path:
- results_2023-12-03T19-35-56.043440.parquet
- split: latest
path:
- results_2023-12-03T19-35-56.043440.parquet
---
# Dataset Card for Evaluation run of marcchew/test1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/marcchew/test1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [marcchew/test1](https://huggingface.co/marcchew/test1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_marcchew__test1",
"harness_gsm8k_5",
    split="latest")
```
## Latest results
These are the [latest results from run 2023-12-03T19:35:56.043440](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__test1/blob/main/results_2023-12-03T19-35-56.043440.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
qgyd2021/e_commerce_customer_service | ---
task_categories:
- text-retrieval
- question-answering
language:
- en
tags:
- e-commerce
size_categories:
- 1M<n<10M
---
## E-commerce Customer Service Dataset
E-commerce data collected from the [lightinthebox](https://www.lightinthebox.com/) website. This data can be used for research on e-commerce customer service chatbots.
Data contents:
faq.json: question-answer pairs for general questions.
product.jsonl: product information.
The examples directory contains the crawler code used to collect the product information.
python==3.8.10
|
Emm9625/0405-cnn_dailymail-3.0.0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: article
dtype: string
- name: highlights
dtype: string
- name: article_length
dtype: int64
- name: highlights_length
dtype: int64
- name: topic
dtype: string
- name: topic_score
dtype: float64
splits:
- name: train
num_bytes: 1267659395
num_examples: 287113
- name: test
num_bytes: 50209059
num_examples: 11490
- name: validation
num_bytes: 58060805
num_examples: 13368
download_size: 838330477
dataset_size: 1375929259
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
myrtotsok/clf-2 | ---
dataset_info:
features:
- name: request
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 118767
num_examples: 1120
- name: validation
num_bytes: 29932
num_examples: 280
download_size: 25669
dataset_size: 148699
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
CyberHarem/anis_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of anis/アニス/阿妮斯/아니스 (Nikke: Goddess of Victory)
This is the dataset of anis/アニス/阿妮斯/아니스 (Nikke: Goddess of Victory), containing 500 images and their tags.
The core tags of this character are `breasts, brown_hair, short_hair, bangs, large_breasts, ahoge, eyewear_on_head, sunglasses, brown_eyes, hair_ornament, yellow_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 923.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anis_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 432.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anis_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1351 | 1021.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anis_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 774.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anis_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1351 | 1.57 GiB | [Download](https://huggingface.co/datasets/CyberHarem/anis_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/anis_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, blush, cleavage, collarbone, holding_can, jacket, looking_at_viewer, navel, off_shoulder, simple_background, smile, solo, white_bikini, bare_shoulders, long_sleeves, necklace, side-tie_bikini_bottom, white_background, open_mouth, thighs, eyepatch_bikini |
| 1 | 10 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, necklace, solo, white_bikini, navel, bare_shoulders, collarbone, jacket, official_alternate_costume, smile, thighs, blush, outdoors, side-tie_bikini_bottom, holding_can, off_shoulder, day, sitting, blue_sky, open_mouth |
| 2 | 7 |  |  |  |  |  | 1girl, blue_sky, blush, cleavage, collarbone, day, looking_at_viewer, navel, open_jacket, outdoors, smile, solo, white_bikini, bare_shoulders, long_sleeves, necklace, open_mouth, thighs, ocean, off_shoulder, side-tie_bikini_bottom, holding_can, palm_tree, standing, stomach, beach, cowboy_shot, tinted_eyewear, upper_teeth_only |
| 3 | 5 |  |  |  |  |  | 1girl, beret, black_headwear, black_jacket, black_shorts, cleavage, fingerless_gloves, holding_gun, long_sleeves, looking_at_viewer, open_jacket, short_shorts, solo, ammunition_belt, black_gloves, black_thighhighs, grenade_launcher, grey_gloves, midriff, navel, open_mouth, thigh_strap, thighs, blush, crop_top, grey_shirt, :d, cat_hair_ornament, cowboy_shot, outdoors, salute, skindentation, standing, trigger_discipline |
| 4 | 8 |  |  |  |  |  | 1girl, beret, black_headwear, black_jacket, black_shorts, black_thighhighs, fingerless_gloves, looking_at_viewer, open_jacket, short_shorts, solo, cleavage, long_sleeves, open_mouth, thigh_strap, ammunition_belt, collarbone, grey_shirt, black_gloves, grey_gloves, thighs, :d, salute, skindentation, blush, cowboy_shot, crop_top, midriff, simple_background |
| 5 | 7 |  |  |  |  |  | 1boy, 1girl, blush, hetero, navel, nipples, sex, solo_focus, vaginal, open_mouth, penis, pussy, thighhighs, beret, black_headwear, smile, spread_legs, cowgirl_position, girl_on_top, black_jacket, long_sleeves, mosaic_censoring, open_clothes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | cleavage | collarbone | holding_can | jacket | looking_at_viewer | navel | off_shoulder | simple_background | smile | solo | white_bikini | bare_shoulders | long_sleeves | necklace | side-tie_bikini_bottom | white_background | open_mouth | thighs | eyepatch_bikini | official_alternate_costume | outdoors | day | sitting | blue_sky | open_jacket | ocean | palm_tree | standing | stomach | beach | cowboy_shot | tinted_eyewear | upper_teeth_only | beret | black_headwear | black_jacket | black_shorts | fingerless_gloves | holding_gun | short_shorts | ammunition_belt | black_gloves | black_thighhighs | grenade_launcher | grey_gloves | midriff | thigh_strap | crop_top | grey_shirt | :d | cat_hair_ornament | salute | skindentation | trigger_discipline | 1boy | hetero | nipples | sex | solo_focus | vaginal | penis | pussy | thighhighs | spread_legs | cowgirl_position | girl_on_top | mosaic_censoring | open_clothes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------|:-------------|:--------------|:---------|:--------------------|:--------|:---------------|:--------------------|:--------|:-------|:---------------|:-----------------|:---------------|:-----------|:-------------------------|:-------------------|:-------------|:---------|:------------------|:-----------------------------|:-----------|:------|:----------|:-----------|:--------------|:--------|:------------|:-----------|:----------|:--------|:--------------|:-----------------|:-------------------|:--------|:-----------------|:---------------|:---------------|:--------------------|:--------------|:---------------|:------------------|:---------------|:-------------------|:-------------------|:--------------|:----------|:--------------|:-----------|:-------------|:-----|:--------------------|:---------|:----------------|:---------------------|:-------|:---------|:----------|:------|:-------------|:----------|:--------|:--------|:-------------|:--------------|:-------------------|:--------------|:-------------------|:---------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | X | X | X | | X | X | | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | | X | X | X | | X | X | X | X | X | X | X | | X | X | | | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | | | | X | X | | | | X | | | X | | | | X | X | | | X | | | | X | | | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | X | X | | | X | | | X | | X | | | X | | | | X | X | | | | | | | X | | | | | | X | | | X | X | X | X | X | | X | X | X | X | | X | X | X | X | X | X | | X | X | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | X | | | | | | X | | | X | | | | X | | | | X | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Multimodal-Fatima/OxfordPets_copy_test | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': abyssinian
'1': american bulldog
'2': american pit bull terrier
'3': basset hound
'4': beagle
'5': bengal
'6': birman
'7': bombay
'8': boxer
'9': british shorthair
'10': chihuahua
'11': egyptian mau
'12': english cocker spaniel
'13': english setter
'14': german shorthaired
'15': great pyrenees
'16': havanese
'17': japanese chin
'18': keeshond
'19': leonberger
'20': maine coon
'21': miniature pinscher
'22': newfoundland
'23': persian
'24': pomeranian
'25': pug
'26': ragdoll
'27': russian blue
'28': saint bernard
'29': samoyed
'30': scottish terrier
'31': shiba inu
'32': siamese
'33': sphynx
'34': staffordshire bull terrier
'35': wheaten terrier
'36': yorkshire terrier
- name: species
dtype:
class_label:
names:
'0': Cat
'1': Dog
- name: id
dtype: int64
- name: clip_tags_ViT_L_14
sequence: string
- name: blip_caption
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_ViT_L_14
sequence: string
- name: clip_tag_ViT_L_14_specific
dtype: string
- name: clip_tags_ViT_L_14_ensemble_specific
dtype: string
- name: clip_tags_ViT_L_14_simple_specific
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: clip_tags_ViT_L_14_with_openai_classes
sequence: string
- name: clip_tags_ViT_L_14_wo_openai_classes
sequence: string
- name: Attributes_ViT_L_14_text_davinci_003
sequence: string
- name: Attributes_ViT_L_14_text_davinci_003_full
sequence: string
- name: Attributes_ViT_L_14_text_davinci_003_oxfordpets
sequence: string
- name: clip_tags_ViT_B_16_simple_specific
dtype: string
- name: clip_tags_ViT_B_16_ensemble_specific
dtype: string
- name: clip_tags_ViT_B_32_simple_specific
dtype: string
- name: clip_tags_ViT_B_32_ensemble_specific
dtype: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full_validate
sequence: string
- name: Attributes_ViT_B_16_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_simple_specific
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B_ensemble_specific
dtype: string
- name: blip_caption_beam_5_Salesforce_blip2_opt_6.7b
dtype: string
splits:
- name: test
num_bytes: 7518510.0
num_examples: 100
download_size: 7289872
dataset_size: 7518510.0
---
# Dataset Card for "OxfordPets_copy_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/VQAv2_sample_validation_facebook_opt_2.7b_mode_VQAv2_visclues_detection_ns_100 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_8
num_bytes: 2601882
num_examples: 100
download_size: 525348
dataset_size: 2601882
---
# Dataset Card for "VQAv2_sample_validation_facebook_opt_2.7b_mode_VQAv2_visclues_detection_ns_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wmt/wmt20_mlqe_task3 | ---
annotations_creators:
- expert-generated
- machine-generated
language_creators:
- found
language:
- en
- fr
license:
- unknown
multilinguality:
- translation
size_categories:
- 1K<n<10K
source_datasets:
- extended|amazon_us_reviews
task_categories:
- translation
task_ids: []
pretty_name: WMT20 - MultiLingual Quality Estimation (MLQE) Task3
dataset_info:
config_name: plain_text
features:
- name: document_id
dtype: string
- name: source_segments
sequence: string
- name: source_tokenized
sequence: string
- name: mt_segments
sequence: string
- name: mt_tokenized
sequence: string
- name: annotations
sequence:
- name: segment_id
sequence: int32
- name: annotation_start
sequence: int32
- name: annotation_length
sequence: int32
- name: severity
dtype:
class_label:
names:
'0': minor
'1': major
'2': critical
- name: severity_weight
dtype: float32
- name: category
dtype:
class_label:
names:
'0': Addition
'1': Agreement
'2': Ambiguous Translation
'3': Capitalization
'4': Character Encoding
'5': Company Terminology
'6': Date/Time
'7': Diacritics
'8': Duplication
'9': False Friend
'10': Grammatical Register
'11': Hyphenation
'12': Inconsistency
'13': Lexical Register
'14': Lexical Selection
'15': Named Entity
'16': Number
'17': Omitted Auxiliary Verb
'18': Omitted Conjunction
'19': Omitted Determiner
'20': Omitted Preposition
'21': Omitted Pronoun
'22': Orthography
'23': Other POS Omitted
'24': Over-translation
'25': Overly Literal
'26': POS
'27': Punctuation
'28': Shouldn't Have Been Translated
'29': Shouldn't have been translated
'30': Spelling
'31': Tense/Mood/Aspect
'32': Under-translation
'33': Unidiomatic
'34': Unintelligible
'35': Unit Conversion
'36': Untranslated
'37': Whitespace
'38': Word Order
'39': Wrong Auxiliary Verb
'40': Wrong Conjunction
'41': Wrong Determiner
'42': Wrong Language Variety
'43': Wrong Preposition
'44': Wrong Pronoun
- name: token_annotations
sequence:
- name: segment_id
sequence: int32
- name: first_token
sequence: int32
- name: last_token
sequence: int32
- name: token_after_gap
sequence: int32
- name: severity
dtype:
class_label:
names:
'0': minor
'1': major
'2': critical
- name: category
dtype:
class_label:
names:
'0': Addition
'1': Agreement
'2': Ambiguous Translation
'3': Capitalization
'4': Character Encoding
'5': Company Terminology
'6': Date/Time
'7': Diacritics
'8': Duplication
'9': False Friend
'10': Grammatical Register
'11': Hyphenation
'12': Inconsistency
'13': Lexical Register
'14': Lexical Selection
'15': Named Entity
'16': Number
'17': Omitted Auxiliary Verb
'18': Omitted Conjunction
'19': Omitted Determiner
'20': Omitted Preposition
'21': Omitted Pronoun
'22': Orthography
'23': Other POS Omitted
'24': Over-translation
'25': Overly Literal
'26': POS
'27': Punctuation
'28': Shouldn't Have Been Translated
'29': Shouldn't have been translated
'30': Spelling
'31': Tense/Mood/Aspect
'32': Under-translation
'33': Unidiomatic
'34': Unintelligible
'35': Unit Conversion
'36': Untranslated
'37': Whitespace
'38': Word Order
'39': Wrong Auxiliary Verb
'40': Wrong Conjunction
'41': Wrong Determiner
'42': Wrong Language Variety
'43': Wrong Preposition
'44': Wrong Pronoun
- name: token_index
sequence:
sequence:
sequence: int32
- name: total_words
dtype: int32
splits:
- name: train
num_bytes: 10762231
num_examples: 1448
- name: test
num_bytes: 743088
num_examples: 180
- name: validation
num_bytes: 1646472
num_examples: 200
download_size: 4660293
dataset_size: 13151791
configs:
- config_name: plain_text
data_files:
- split: train
path: plain_text/train-*
- split: test
path: plain_text/test-*
- split: validation
path: plain_text/validation-*
default: true
---
# Dataset Card for WMT20 - MultiLingual Quality Estimation (MLQE) Task3
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [WMT20 Quality Estimation Shared Task](http://www.statmt.org/wmt20/quality-estimation-task.html)
- **Repository**: [Github repository](https://github.com/deep-spin/deep-spin.github.io/tree/master/docs/data/wmt2020_qe)
- **Paper:** *Not available*
### Dataset Summary
From the homepage:
*This shared task (part of WMT20) will build on its previous editions to further examine automatic methods for estimating the quality of neural machine translation output at run-time, without relying on reference translations. As in previous years, we cover estimation at various levels. Important elements introduced this year include: a new task where sentences are annotated with Direct Assessment (DA) scores instead of labels based on post-editing; a new multilingual sentence-level dataset mainly from Wikipedia articles, where the source articles can be retrieved for document-wide context; the availability of NMT models to explore system-internal information for the task.*
*The goal of this task 3 is to predict document-level quality scores as well as fine-grained annotations.*
*Each document has a product title and its description, and is annotated for translation errors according to the MQM framework. Each error annotation has:*
- ***Word span(s).*** *Errors may consist of one or more words, not necessarily contiguous.*
- ***Severity.*** *An error can be minor (if it doesn't lead to a loss of meaning and it doesn't confuse or mislead the user), major (if it changes the meaning) or critical (if it changes the meaning and carry any type of implication, or could be seen as offensive).*
- ***Type.*** *A label specifying the error type, such as wrong word order, missing words, agreement, etc. They may provide additional information, but systems don't need to predict them.*
### Supported Tasks and Leaderboards
From the homepage:
*Submissions will be evaluated as in Task 1, in terms of Pearson's correlation between the true and predicted MQM document-level scores. Additionally, the predicted annotations will be evaluated in terms of their F1 scores with respect to the gold annotations. The [official evaluation scripts](https://github.com/sheffieldnlp/qe-eval-scripts) are available.*
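As a rough, self-contained illustration (not the official implementation — use the evaluation scripts linked above for actual submissions), Pearson's correlation between true and predicted document-level scores can be sketched as:

```python
from statistics import mean

def pearson(x, y):
    # Pearson's r: covariance of x and y divided by the product of
    # their standard deviations (the 1/n factors cancel out).
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical gold vs. predicted MQM document-level scores.
print(pearson([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]))  # close to 1.0
```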
### Languages
There is a single language pair in the dataset: English (`en`) - French (`fr`).
## Dataset Structure
### Data Instances
An example looks like this:
```
{
'document_id': 'B0000568SY',
'source_segments': ['Razor Scooter Replacement Wheels Set with Bearings', 'Scooter Wheels w/Bearings-Blue'],
'source_tokenized': ['Razor Scooter Replacement Wheels Set with Bearings', 'Scooter Wheels w / Bearings-Blue'],
'mt_segments': ['Roues de rechange Razor Scooter sertie de roulements', 'Roues de scooter w/roulements-bleu'],
'mt_tokenized': ['Roues de rechange Razor Scooter sertie de roulements', 'Roues de scooter w / roulements-bleu'],
'annotations': {
'segment_id': [[0], [1], [1], [0, 0], [0], [1], [1]],
'annotation_start': [[42], [19], [9], [0, 32], [9], [17], [30]],
'annotation_length': [[10], [10], [7], [5, 6], [8], [1], [4]],
'severity': [0, 0, 0, 0, 0, 1, 0],
'severity_weight': [1.0, 1.0, 1.0, 1.0, 1.0, 5.0, 1.0]
'category': [3, 3, 3, 1, 3, 36, 3],
},
'token_annotations': {
'category': [3, 3, 3, 1, 3, 36, 3],
'first_token': [[7], [5], [2], [0, 5], [2], [3], [5]],
'last_token': [[7], [5], [2], [0, 5], [2], [3], [5]],
'segment_id': [[0], [1], [1], [0, 0], [0], [1], [1]],
'severity': [0, 0, 0, 0, 0, 1, 0],
'token_after_gap': [[-1], [-1], [-1], [-1, -1], [-1], [-1], [-1]]
},
'token_index': [[[0, 5], [6, 2], [9, 8], [18, 5], [24, 7], [32, 6], [39, 2], [42, 10]], [[0, 5], [6, 2], [9, 7], [17, 1], [18, 1], [19, 15]]],
'total_words': 16
}
```
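As a minimal sketch (not part of any official tooling), the character-level error spans in this instance can be resolved against `mt_segments` like so:

```python
mt_segments = ['Roues de rechange Razor Scooter sertie de roulements',
               'Roues de scooter w/roulements-bleu']
annotations = {
    'segment_id': [[0], [1], [1], [0, 0], [0], [1], [1]],
    'annotation_start': [[42], [19], [9], [0, 32], [9], [17], [30]],
    'annotation_length': [[10], [10], [7], [5, 6], [8], [1], [4]],
}

def error_spans(ann, segments):
    # One entry per error annotation; each may cover several
    # (possibly non-contiguous) character spans.
    return [[segments[s][a:a + l]
             for s, a, l in zip(seg_ids, starts, lengths)]
            for seg_ids, starts, lengths in zip(ann['segment_id'],
                                                ann['annotation_start'],
                                                ann['annotation_length'])]

print(error_spans(annotations, mt_segments)[3])  # → ['Roues', 'sertie']
```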
### Data Fields
- `document_id`: the document id (name of the folder).
- `source_segments`: the original source text, one sentence per line (i.e. per element of the list).
- `source_tokenized`: a tokenized version of `source_segments`.
- `mt_segments`: the original machine-translated text, one sentence per line (i.e. per element of the list).
- `mt_tokenized`: a tokenized version of `mt_segments`. Default value is `[]` when this information is not available (it happens 3 times in the train set: `B0001BW0PQ`, `B0001GS19U` and `B000A6SMJ0`).
- `annotations`: error annotations for the document. Each item of the list corresponds to an error annotation, which in turn may contain one or more error spans. Error fields are encoded in a dictionary. In the case of a multi-span error, multiple starting positions and lengths are encoded in the list. Note that these positions point to `mt_segments`, not `mt_tokenized`.
- `segment_id`: List of list of integers. Id of each error.
- `annotation_start`: List of list of integers. Start of each error.
- `annotation_length`: List of list of integers. Length of each error.
- `severity`: List of one hot. Severity category of each error.
- `severity_weight`: List of floats. Severity weight of each error.
- `category`: List of one hot. Category of each error. See the 45 categories in `_ANNOTATION_CATEGORIES_MAPPING`.
- `token_annotations`: tokenized version of `annotations`. Each error span that contains one or more tokens has a "first token" and "last token". Again, multi-span errors have their first and last tokens encoded in a list. When a span is over a gap between two tokens, the "first" and "last" positions are `-1` (encoded as `-` in the original data), and instead the `token_after_gap` column points to the token immediately after the gap. In case of a gap occurring at the end of the sentence, this value will be equal to the number of tokens.
- `segment_id`: List of list of integers. Id of each error.
- `first_token`: List of list of integers. Start of each error.
- `last_token`: List of list of integers. End of each error.
- `token_after_gap`: List of list of integers. Token after gap of each error.
- `severity`: List of one hot. Severity category of each error.
- `category`: List of one hot. Category of each error. See the 45 categories in `_ANNOTATION_CATEGORIES_MAPPING`.
- `token_index`: a mapping of tokens to their positions in `mt_segments`. For each token, a start offset and a length are encoded in a list of length 2, and all tokens of a segment represent one item in the list.
- `total_words`: total number of words in the document
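As an illustration, the tokens of a segment can be recovered from `token_index` with a short sketch (using the instance shown earlier, where each pair behaves as a start offset and a length):

```python
mt_segments = ['Roues de rechange Razor Scooter sertie de roulements',
               'Roues de scooter w/roulements-bleu']
token_index = [[[0, 5], [6, 2], [9, 8], [18, 5], [24, 7], [32, 6], [39, 2], [42, 10]],
               [[0, 5], [6, 2], [9, 7], [17, 1], [18, 1], [19, 15]]]

def tokens_of(segment_id):
    # Slice each token out of the untokenized segment text.
    seg = mt_segments[segment_id]
    return [seg[start:start + length] for start, length in token_index[segment_id]]

print(tokens_of(1))  # → ['Roues', 'de', 'scooter', 'w', '/', 'roulements-bleu']
```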
```
_ANNOTATION_CATEGORIES_MAPPING = {
0: 'Addition',
1: 'Agreement',
2: 'Ambiguous Translation',
3: 'Capitalization',
4: 'Character Encoding',
5: 'Company Terminology',
6: 'Date/Time',
7: 'Diacritics',
8: 'Duplication',
9: 'False Friend',
10: 'Grammatical Register',
11: 'Hyphenation',
12: 'Inconsistency',
13: 'Lexical Register',
14: 'Lexical Selection',
15: 'Named Entity',
16: 'Number',
17: 'Omitted Auxiliary Verb',
18: 'Omitted Conjunction',
19: 'Omitted Determiner',
20: 'Omitted Preposition',
21: 'Omitted Pronoun',
22: 'Orthography',
23: 'Other POS Omitted',
24: 'Over-translation',
25: 'Overly Literal',
26: 'POS',
27: 'Punctuation',
28: "Shouldn't Have Been Translated",
29: "Shouldn't have been translated",
30: 'Spelling',
31: 'Tense/Mood/Aspect',
32: 'Under-translation',
33: 'Unidiomatic',
34: 'Unintelligible',
35: 'Unit Conversion',
36: 'Untranslated',
37: 'Whitespace',
38: 'Word Order',
39: 'Wrong Auxiliary Verb',
40: 'Wrong Conjunction',
41: 'Wrong Determiner',
42: 'Wrong Language Variety',
43: 'Wrong Preposition',
44: 'Wrong Pronoun'
}
```
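For illustration, a one-hot `category` (or `severity`) vector can be decoded back to its label name with this mapping. A minimal sketch, assuming the vector layout follows the field description above (the mapping is abbreviated here; fill in the remaining entries from the full table before using it on real data):

```python
# Decode a one-hot category vector using _ANNOTATION_CATEGORIES_MAPPING.
# Abbreviated copy of the mapping above; entries 3-43 are elided.
_ANNOTATION_CATEGORIES_MAPPING = {
    0: "Addition",
    1: "Agreement",
    2: "Ambiguous Translation",
    # ... entries 3-43 as listed above ...
    44: "Wrong Pronoun",
}

def decode_one_hot(vector, mapping=_ANNOTATION_CATEGORIES_MAPPING):
    """Return the label name for a one-hot encoded category vector."""
    hot = [i for i, v in enumerate(vector) if v == 1]
    if len(hot) != 1:
        raise ValueError(f"expected exactly one hot index, got {hot}")
    return mapping[hot[0]]

vector = [0] * 45
vector[2] = 1
print(decode_one_hot(vector))  # Ambiguous Translation
```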
### Data Splits
The dataset contains 1,448 documents for training, 200 documents for validation and 180 for (blind) test (all English-French).
## Dataset Creation
### Curation Rationale
The data is derived from the [Amazon Product Reviews dataset](http://jmcauley.ucsd.edu/data/amazon/).
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Unknown
### Citation Information
```
Not available.
```
### Contributions
Thanks to [@VictorSanh](https://github.com/VictorSanh) for adding this dataset. |
orai-nlp/basqueGLUE | ---
language:
- eu
pretty_name: BasqueGLUE
size_categories:
- 100K<n<1M
---
# Dataset Card for BasqueGLUE
## Table of Contents
* [Table of Contents](#table-of-contents)
* [Dataset Description](#dataset-description)
* [Dataset Summary](#dataset-summary)
* [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
* [Languages](#languages)
* [Dataset Structure](#dataset-structure)
* [Data Instances](#data-instances)
* [Data Fields](#data-fields)
* [Data Splits](#data-splits)
* [Dataset Creation](#dataset-creation)
* [Curation Rationale](#curation-rationale)
* [Additional Information](#additional-information)
* [Dataset Curators](#dataset-curators)
* [Licensing Information](#licensing-information)
* [Citation Information](#citation-information)
* [Contributions](#contributions)
## Dataset Description
* **Repository:** <https://github.com/orai-nlp/BasqueGLUE>
* **Paper:** [BasqueGLUE: A Natural Language Understanding Benchmark for Basque](http://www.lrec-conf.org/proceedings/lrec2022/pdf/2022.lrec-1.172.pdf)
* **Point of Contact:** [Contact Information](https://github.com/orai-nlp/BasqueGLUE#contact-information)
### Dataset Summary
Natural Language Understanding (NLU) technology has improved significantly over the last few years, and multitask benchmarks such as GLUE are key to evaluate this improvement in a robust and general way. These benchmarks take into account a wide and diverse set of NLU tasks that require some form of language understanding, beyond the detection of superficial, textual clues. However, they are costly to develop and language-dependent, and therefore they are only available for a small number of languages.
We present BasqueGLUE, the first NLU benchmark for Basque, which has been elaborated from previously existing datasets and following similar criteria to those used for the construction of GLUE and SuperGLUE. BasqueGLUE is freely available under an open license.
| Dataset | \|Train\| | \|Val\| | \|Test\| | Task | Metric | Domain |
|----------------|----------:|--------:|---------:|------------------------|:------:|-----------------|
| NERCid | 51,539 | 12,936 | 35,855 | NERC | F1 | News |
| NERCood | 64,475 | 14,945 | 14,462 | NERC | F1 | News, Wikipedia |
| FMTODeu_intent | 3,418 | 1,904 | 1,087 | Intent classification | F1 | Dialog system |
| FMTODeu_slot | 19,652 | 10,791 | 5,633 | Slot filling | F1 | Dialog system |
| BHTCv2 | 8,585 | 1,857 | 1,854 | Topic classification | F1 | News |
| BEC2016eu | 6,078 | 1,302 | 1,302 | Sentiment analysis | F1 | Twitter |
| VaxxStance | 864 | 206 | 312 | Stance detection | MF1* | Twitter |
| QNLIeu | 1,764 | 230 | 238 | QA/NLI | Acc | Wikipedia |
| WiCeu | 408,559 | 600 | 1,400 | WSD | Acc | Wordnet |
| EpecKorrefBin | 986 | 320 | 587 | Coreference resolution | Acc | News |
### Supported Tasks and Leaderboards
This benchmark comprises the following tasks:
#### NERCid
This dataset contains sentences from the news domain with manually annotated named entities. The data is the merge of EIEC (a dataset of a collection of news wire articles from Euskaldunon Egunkaria newspaper, (Alegria et al. 2004)), and newly annotated data from naiz.eus. The data is annotated following the BIO annotation scheme over four categories: person, organization, location, and miscellaneous.
#### NERCood
This dataset contains sentences with manually annotated named entities. The training data is the merge of EIEC (a dataset of a collection of news wire articles from Euskaldunon Egunkaria newspaper, (Alegria et al. 2004)), and newly annotated data from naiz.eus. The data is annotated following the BIO annotation scheme over four categories: person, organization, location, and miscellaneous. For validation and test sets, sentences from Wikipedia were annotated following the same annotation guidelines.
#### FMTODeu_intent
This dataset contains utterance texts and intent annotations drawn from the manually-annotated Facebook Multilingual Task Oriented Dataset (FMTOD) (Schuster et al. 2019). Basque translated data was drawn from the datasets created for Building a Task-oriented Dialog System for languages with no training data: the Case for Basque (de Lacalle et al. 2020). The examples are annotated with one of 12 different intent classes corresponding to alarm, reminder or weather related actions.
#### FMTODeu_slot
This dataset contains utterance texts and sequence intent argument annotations designed for slot filling tasks, drawn from the manually-annotated Facebook Multilingual Task Oriented Dataset (FMTOD) (Schuster et al. 2019). Basque translated data was drawn from the datasets created for Building a Task-oriented Dialog System for languages with no training data: the Case for Basque (de Lacalle et al. 2020). The task is a sequence labelling task similar to NERC, following BIO annotation scheme over 11 categories.
#### BHTCv2
The corpus contains 12,296 news headlines (brief article descriptions) from the Basque weekly newspaper [Argia](https://www.argia.eus). Topics are classified uniquely according to twelve thematic categories.
#### BEC2016eu
The Basque Election Campaign 2016 Opinion Dataset (BEC2016eu) is a new dataset for the task of sentiment analysis, a sequence classification task, which contains tweets about the campaign for the Basque elections from 2016. The crawling was carried out during the election campaign period (2016/09/09-2016/09/23), by monitoring the main parties and their respective candidates. The tweets were manually annotated as positive, negative or neutral.
#### VaxxStance
The VaxxStance (Agerri et al., 2021) dataset originally provides texts and stance annotations for social media texts around the anti-vaccine movement. Texts are given a label indicating whether they express an AGAINST, FAVOR or NEUTRAL stance towards the topic.
#### QNLIeu
This task includes the QA dataset ElkarHizketak (Otegi et al. 2020), a low-resource conversational Question Answering (QA) dataset for Basque created by native speaker volunteers. The dataset is built on top of Wikipedia sections about popular people and organizations, and it contains around 400 dialogues and 1,600 question and answer pairs. The task was adapted into a sentence-pair binary classification task, following the design of QNLI for English (Wang et al. 2019). Each question-answer pair is given a label indicating whether the answer is entailed by the question.
#### WiCeu
Word in Context or WiC (Pilehvar and Camacho-Collados 2019) is a word sense disambiguation (WSD) task, designed as a particular form of sentence-pair binary classification. Given two text snippets and a polysemous word that appears in both of them (the span of the word is marked in both snippets), the task is to determine whether the word has the same sense in both sentences. This dataset is based on the EPEC-EuSemcor (Pociello et al. 2011) sense-tagged corpus.
#### EpecKorrefBin
EPEC-KORREF-Bin is a dataset derived from EPEC-KORREF (Soraluze et al. 2012), a corpus of Basque news documents with manually annotated mentions and coreference chains, which has been converted into a binary classification task. In this task, the model has to predict whether two mentions from a text, which can be pronouns, nouns or noun phrases, are referring to the same entity.
#### Leaderboard
Results obtained for two BERT-base models, as baselines for the benchmark.
| | AVG | NERC | F_intent | F_slot | BHTC | BEC | Vaxx | QNLI | WiC | coref |
|------------------------------------------------------------|:-----:|:-----:|:---------:|:-------:|:-----:|:-----:|:-----:|:-----:|:-----:|:-----:|
| Model | | F1 | F1 | F1 | F1 | F1 | MF1 | acc | acc | acc |
|[BERTeus](https://huggingface.co/ixa-ehu/berteus-base-cased)| 73.23 | 81.92 | 82.52 | 74.34 | 78.26 | 69.43 | 59.30 | 74.26 | 70.71 | 68.31 |
|[ElhBERTeu](https://huggingface.co/elh-eus/ElhBERTeu) | 73.71 | 82.30 | 82.24 | 75.64 | 78.05 | 69.89 | 63.81 | 73.84 | 71.71 | 65.93 |
The results obtained on NERC are the average of in-domain and out-of-domain NERC.
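As a sanity check on the table, the AVG column is the plain mean of the nine per-task scores (NERC already being the average of the in-domain and out-of-domain results); for BERTeus:

```python
# Reproduce the AVG column for BERTeus from its per-task scores above.
# NERC (81.92) is already averaged over NERCid and NERCood.
scores = {
    "NERC": 81.92, "F_intent": 82.52, "F_slot": 74.34,
    "BHTC": 78.26, "BEC": 69.43, "Vaxx": 59.30,
    "QNLI": 74.26, "WiC": 70.71, "coref": 68.31,
}
avg = sum(scores.values()) / len(scores)
print(round(avg, 2))  # 73.23
```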
### Languages
Data are available in Basque (BCP-47 `eu`).
## Dataset Structure
### Data Instances
#### NERCid/NERCood
An example of 'train' looks as follows:
```
{
"idx": 0,
"tags": ["O", "O", "O", "O", "B-ORG", "O", ...],
"tokens": ["Greba", "orokorrera", "deitu", "du", "EHk", "27rako", ...]
}
```
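For the sequence-labelling tasks, the BIO tags can be grouped back into entity spans. A minimal decoder sketch (not part of any official loading script):

```python
def bio_to_entities(tokens, tags):
    """Group BIO-tagged tokens into (entity_type, text) spans."""
    entities, current = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):  # a new entity begins
            if current:
                entities.append(current)
            current = (tag[2:], [token])
        elif tag.startswith("I-") and current and tag[2:] == current[0]:
            current[1].append(token)  # continuation of the open entity
        else:  # "O" or an inconsistent I- tag closes the open span
            if current:
                entities.append(current)
            current = None
    if current:
        entities.append(current)
    return [(etype, " ".join(words)) for etype, words in entities]

tokens = ["Greba", "orokorrera", "deitu", "du", "EHk", "27rako"]
tags = ["O", "O", "O", "O", "B-ORG", "O"]
print(bio_to_entities(tokens, tags))  # [('ORG', 'EHk')]
```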
#### FMTODeu_intent
An example of 'train' looks as follows:
```
{
"idx": 0,
"label": "alarm/modify_alarm",
"text": "aldatu alarma 7am-tik 7pm-ra , mesedez"
}
```
#### FMTODeu_slot
An example of 'train' looks as follows:
```
{
"idx": 923,
"tags": ["O", "B-reminder/todo", "I-datetime", "I-datetime", "B-reminder/todo"],
"tokens": ["gogoratu", "zaborra", "gaur", "gauean", "ateratzea"]
}
```
#### BHTCv2
An example of 'test' looks as follows:
```
{
"idx": 0,
"label": "Gizartea",
"text": "Genero berdintasunaz, hezkuntzaz eta klase gizarteaz hamar liburu baino gehiago..."
}
```
#### BEC2016eu
An example of 'test' looks as follows:
```
{
"idx": 0,
"label": "NEU",
"text": '"Emandako hitza bete egingo dut" Urkullu\nBa galdeketa enegarrenez daramazue programan (ta zuen AHTa...)\n#I25debatea #URL"'
}
```
#### VaxxStance
An example of 'train' looks as follows:
```
{
"idx": 0,
"label": "FAVOR",
"text": "\"#COVID19 Oraingo datuak, izurriaren dinamika, txertoaren eragina eta birusaren..
}
```
#### QNLIeu
An example of 'train' looks as follows:
```
{
"idx": 1,
"label": "not_entailment",
"question": "Zein posiziotan jokatzen du Busquets-ek?",
"sentence": "Busquets 23 partidatan izan zen konbokatua eta 2 gol sartu zituen."
}
```
#### WiCeu
An example of 'test' looks as follows:
```
{
"idx": 16,
"label": false,
"word": "udal",
"sentence1": "1a . Lekeitioko udal mugarteko Alde Historikoa Birgaitzeko Plan Berezia behin...",
"sentence2": "Diezek kritikatu egin zuen EAJk zenbait udaletan EH gobernu taldeetatik at utzi...",
"start1": 16,
"start2": 40,
"end1": 21,
"end2": 49
}
```
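The `start1`/`end1` and `start2`/`end2` fields are character offsets into the sentences, so the marked occurrence can be recovered by plain slicing. A sketch with a constructed example (whether the end offset is inclusive or exclusive is an assumption here and should be verified against the released data):

```python
def marked_span(sentence, start, end):
    """Extract the marked word occurrence from a WiC-style example.

    Assumes `end` is an exclusive character offset; verify this
    convention against the actual data before relying on it.
    """
    return sentence[start:end]

# Constructed example, not drawn from the dataset:
sentence = "udal mugarteko plana onartu da"
print(marked_span(sentence, 0, 4))  # udal
```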
#### EpecKorrefBin
An example of 'train' looks as follows:
```
{
"idx": 6,
"label": false,
"text": "Isuntza da faborito nagusia Elantxobeko banderan . ISUNTZA trainerua da faborito nagusia bihar Elantxoben jokatuko den bandera irabazteko .",
"span1_text": "Elantxobeko banderan",
"span2_text": "ISUNTZA trainerua",
"span1_index": 4,
"span2_index": 8
}
```
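`span1_index` and `span2_index` are token positions. Under a naive whitespace tokenization (an assumption — the corpus may use its own tokenizer) the mention text should start at that position, which holds for `span1` of the instance above:

```python
def mention_at(text, index, mention):
    """Check that `mention` starts at token position `index` of `text`
    under naive whitespace tokenization (an assumption, not necessarily
    the corpus's own tokenization)."""
    tokens = text.split()
    n = len(mention.split())
    return " ".join(tokens[index:index + n]) == mention

text = ("Isuntza da faborito nagusia Elantxobeko banderan . "
        "ISUNTZA trainerua da faborito nagusia bihar Elantxoben "
        "jokatuko den bandera irabazteko .")
print(mention_at(text, 4, "Elantxobeko banderan"))  # True
```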
### Data Fields
#### NERCid
* `tokens`: a list of `string` features
* `tags`: a list of entity labels, with possible values including `person` (PER), `location` (LOC), `organization` (ORG), `miscellaneous` (MISC)
* `idx`: an `int32` feature
#### NERCood
* `tokens`: a list of `string` features
* `tags`: a list of entity labels, with possible values including `person` (PER), `location` (LOC), `organization` (ORG), `miscellaneous` (MISC)
* `idx`: an `int32` feature
#### FMTODeu_intent
* `text`: a `string` feature
* `label`: an intent label, with possible values including:
* `alarm/cancel_alarm`
* `alarm/modify_alarm`
* `alarm/set_alarm`
* `alarm/show_alarms`
* `alarm/snooze_alarm`
* `alarm/time_left_on_alarm`
* `reminder/cancel_reminder`
* `reminder/set_reminder`
* `reminder/show_reminders`
* `weather/checkSunrise`
* `weather/checkSunset`
* `weather/find`
* `idx`: an `int32` feature
#### FMTODeu_slot
* `tokens`: a list of `string` features
* `tags`: a list of intent labels, with possible values including:
* `datetime`
* `location`
* `negation`
* `alarm/alarm_modifier`
* `alarm/recurring_period`
* `reminder/noun`
* `reminder/todo`
* `reminder/reference`
* `reminder/recurring_period`
* `weather/attribute`
* `weather/noun`
* `idx`: an `int32` feature
#### BHTCv2
* `text`: a `string` feature
* `label`: a topic label, with possible values including:
* `Ekonomia`
* `Euskal Herria`
* `Euskara`
* `Gizartea`
* `Historia`
* `Ingurumena`
* `Iritzia`
* `Komunikazioa`
* `Kultura`
* `Nazioartea`
* `Politika`
* `Zientzia`
* `idx`: an `int32` feature
#### BEC2016eu
* `text`: a `string` feature
* `label`: a polarity label, with possible values including `neutral` (NEU), `negative` (N), `positive` (P)
* `idx`: an `int32` feature
#### VaxxStance
* `text`: a `string` feature
* `label`: a stance label, with possible values including `AGAINST`, `FAVOR`, `NONE`
* `idx`: an `int32` feature
#### QNLIeu
* `question`: a `string` feature
* `sentence`: a `string` feature
* `label`: an entailment label, with possible values including `entailment`, `not_entailment`
* `idx`: an `int32` feature
#### WiCeu
* `word`: a `string` feature
* `sentence1`: a `string` feature
* `sentence2`: a `string` feature
* `label`: a `boolean` label indicating sense agreement, with possible values including `true`, `false`
* `start1`: an `int` feature indicating the character position where the word occurrence begins in the first sentence
* `start2`: an `int` feature indicating the character position where the word occurrence begins in the second sentence
* `end1`: an `int` feature indicating the character position where the word occurrence ends in the first sentence
* `end2`: an `int` feature indicating the character position where the word occurrence ends in the second sentence
* `idx`: an `int32` feature
#### EpecKorrefBin
* `text`: a `string` feature.
* `label`: a `boolean` coreference label, with possible values including `true`, `false`.
* `span1_text`: a `string` feature
* `span2_text`: a `string` feature
* `span1_index`: an `int` feature indicating token index where `span1_text` feature occurs in `text`
* `span2_index`: an `int` feature indicating token index where `span2_text` feature occurs in `text`
* `idx`: an `int32` feature
### Data Splits
| Dataset | \|Train\| | \|Val\| | \|Test\| |
|---------|--------:|------:|-------:|
| NERCid | 51,539 | 12,936 | 35,855 |
| NERCood | 64,475 | 14,945 | 14,462 |
| FMTODeu_intent | 3,418 | 1,904 | 1,087 |
| FMTODeu_slot | 19,652 | 10,791 | 5,633 |
| BHTCv2 | 8,585 | 1,857 | 1,854 |
| BEC2016eu | 6,078 | 1,302 | 1,302 |
| VaxxStance | 864 | 206 | 312 |
| QNLIeu | 1,764 | 230 | 238 |
| WiCeu | 408,559 | 600 | 1,400 |
| EpecKorrefBin | 986 | 320 | 587 |
## Dataset Creation
### Curation Rationale
We believe that BasqueGLUE is a significant contribution towards developing NLU tools in Basque, which will facilitate technological advances for the Basque language. In order to create BasqueGLUE we took the GLUE and SuperGLUE frameworks as a reference. When possible, we re-used existing datasets for Basque, adapting them to the corresponding task formats where necessary. Additionally, BasqueGLUE also includes six new datasets that have not been published before. In total, BasqueGLUE consists of nine Basque NLU tasks and covers a wide range of tasks with different difficulties across several domains. As with the original GLUE benchmark, the training data for the tasks vary in size, which makes it possible to measure how well models transfer knowledge across tasks.
## Additional Information
### Dataset Curators
Gorka Urbizu [1], Iñaki San Vicente [1], Xabier Saralegi [1], Rodrigo Agerri [2] and Aitor Soroa [2]
Affiliation of the authors:
[1] orai NLP Technologies
[2] HiTZ Center - Ixa, University of the Basque Country UPV/EHU
### Licensing Information
Each dataset of the BasqueGLUE benchmark has its own license (as most of them are, or are derived from, already existing datasets). See their respective README files for details.
Here we provide a brief summary of their licenses:
| Dataset | License |
|---------|---------|
| NERCid | CC BY-NC-SA 4.0 |
| NERCood | CC BY-NC-SA 4.0 |
| FMTODeu_intent | CC BY-NC-SA 4.0 |
| FMTODeu_slot | CC BY-NC-SA 4.0 |
| BHTCv2 | CC BY-NC-SA 4.0 |
| BEC2016eu | Twitter's license + CC BY-NC-SA 4.0 |
| VaxxStance | Twitter's license + CC BY 4.0 |
| QNLIeu | CC BY-SA 4.0 |
| WiCeu | CC BY-NC-SA 4.0 |
| EpecKorrefBin | CC BY-NC-SA 4.0 |
For the rest of the files of the benchmark, including the loading and evaluation scripts, the following license applies:
Copyright (C) by Orai NLP Technologies.
This benchmark and evaluation scripts are licensed under the Creative Commons Attribution Share Alike 4.0
International License (CC BY-SA 4.0). To view a copy of this license, visit http://creativecommons.org/licenses/by-sa/4.0/.
### Citation Information
```
@InProceedings{urbizu2022basqueglue,
author = {Urbizu, Gorka and San Vicente, Iñaki and Saralegi, Xabier and Agerri, Rodrigo and Soroa, Aitor},
title = {BasqueGLUE: A Natural Language Understanding Benchmark for Basque},
booktitle = {Proceedings of the Language Resources and Evaluation Conference},
month = {June},
year = {2022},
address = {Marseille, France},
publisher = {European Language Resources Association},
pages = {1603--1612},
abstract = {Natural Language Understanding (NLU) technology has improved significantly over the last few years and multitask benchmarks such as GLUE are key to evaluate this improvement in a robust and general way. These benchmarks take into account a wide and diverse set of NLU tasks that require some form of language understanding, beyond the detection of superficial, textual clues. However, they are costly to develop and language-dependent, and therefore they are only available for a small number of languages. In this paper, we present BasqueGLUE, the first NLU benchmark for Basque, a less-resourced language, which has been elaborated from previously existing datasets and following similar criteria to those used for the construction of GLUE and SuperGLUE. We also report the evaluation of two state-of-the-art language models for Basque on BasqueGLUE, thus providing a strong baseline to compare upon. BasqueGLUE is freely available under an open license.},
url = {https://aclanthology.org/2022.lrec-1.172}
}
```
### Contributions
Thanks to [@richplant](https://github.com/richplant) for adding this dataset to Hugging Face. |
liuyanchen1015/MULTI_VALUE_mnli_progressives | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 1104064
num_examples: 4794
- name: dev_mismatched
num_bytes: 1208621
num_examples: 5098
- name: test_matched
num_bytes: 1135615
num_examples: 4901
- name: test_mismatched
num_bytes: 1217348
num_examples: 5184
- name: train
num_bytes: 45857810
num_examples: 195951
download_size: 32015718
dataset_size: 50523458
---
# Dataset Card for "MULTI_VALUE_mnli_progressives"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.3_seed_1_t_1.0 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43723341
num_examples: 18928
- name: epoch_1
num_bytes: 44349302
num_examples: 18928
- name: epoch_2
num_bytes: 44438633
num_examples: 18928
- name: epoch_3
num_bytes: 44486939
num_examples: 18928
- name: epoch_4
num_bytes: 44504617
num_examples: 18928
- name: epoch_5
num_bytes: 44505606
num_examples: 18928
- name: epoch_6
num_bytes: 44502945
num_examples: 18928
- name: epoch_7
num_bytes: 44493404
num_examples: 18928
- name: epoch_8
num_bytes: 44487721
num_examples: 18928
- name: epoch_9
num_bytes: 44484647
num_examples: 18928
- name: epoch_10
num_bytes: 44485537
num_examples: 18928
- name: epoch_11
num_bytes: 44484369
num_examples: 18928
- name: epoch_12
num_bytes: 44483539
num_examples: 18928
- name: epoch_13
num_bytes: 44482489
num_examples: 18928
- name: epoch_14
num_bytes: 44480467
num_examples: 18928
- name: epoch_15
num_bytes: 44480879
num_examples: 18928
- name: epoch_16
num_bytes: 44482151
num_examples: 18928
- name: epoch_17
num_bytes: 44481299
num_examples: 18928
- name: epoch_18
num_bytes: 44481753
num_examples: 18928
- name: epoch_19
num_bytes: 44481372
num_examples: 18928
- name: epoch_20
num_bytes: 44480478
num_examples: 18928
- name: epoch_21
num_bytes: 44480671
num_examples: 18928
- name: epoch_22
num_bytes: 44480675
num_examples: 18928
- name: epoch_23
num_bytes: 44481643
num_examples: 18928
- name: epoch_24
num_bytes: 44480720
num_examples: 18928
- name: epoch_25
num_bytes: 44480545
num_examples: 18928
- name: epoch_26
num_bytes: 44481323
num_examples: 18928
- name: epoch_27
num_bytes: 44481728
num_examples: 18928
- name: epoch_28
num_bytes: 44482202
num_examples: 18928
- name: epoch_29
num_bytes: 44482024
num_examples: 18928
download_size: 701111893
dataset_size: 1333613019
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
ctu-aic/qa2d-cs | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: turker_answer
dtype: string
- name: rule-based
dtype: string
- name: dataset
dtype: string
- name: example_uid
dtype: string
splits:
- name: train
num_bytes: 17257995
num_examples: 60710
- name: validation
num_bytes: 2947806
num_examples: 10344
download_size: 14891492
dataset_size: 20205801
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
license: mit
task_categories:
- text2text-generation
language:
- cs
pretty_name: QA2D-cs
size_categories:
- 10K<n<100K
---
Czech version of the Question to Declarative Sentence dataset ([QA2D](https://huggingface.co/datasets/domenicrosati/QA2D)), machine translated using the [DeepL](https://www.deepl.com) service.
For more information, see our [Pipeline and Dataset Generation for Automated Fact-checking in Almost Any Language](https://arxiv.org/abs/2312.10171) paper.
Currently under review for the [NCAA](https://link.springer.com/journal/521) journal.
```bibtex
@article{drchal2023pipeline,
title={Pipeline and Dataset Generation for Automated Fact-checking in Almost Any Language},
author={Drchal, Jan and Ullrich, Herbert and Mlyn{\'a}{\v{r}}, Tom{\'a}{\v{s}} and Moravec, V{\'a}clav},
journal={arXiv preprint arXiv:2312.10171},
year={2023}
}
``` |