datasetId | card |
|---|---|
ddrg/math_formulas | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 225647910.0
num_examples: 2886810
- name: test
num_bytes: 23848817.0
num_examples: 311298
download_size: 131762427
dataset_size: 249496727.0
---
# Dataset Card for "math_formulas"
Mathematical dataset containing formulas based on the [AMPS](https://drive.google.com/file/d/1hQsua3TkpEmcJD_UWQx8dmNdEZPyxw23) Khan dataset and the [ARQMath](https://drive.google.com/drive/folders/1YekTVvfmYKZ8I5uiUMbs21G2mKwF9IAm) dataset V1.3. From the retrieved LaTeX formulas, additional equivalent versions were generated by applying randomized LaTeX printing with this [SymPy fork](https://drive.google.com/drive/folders/1YekTVvfmYKZ8I5uiUMbs21G2mKwF9IAm). The formulas are intended to be well suited for masked language modeling (MLM). For instance, masking a formula like `(a+b)^2 = a^2 + 2ab + b^2` makes sense (e.g., `(a+[MASK])^2 = a^2 + [MASK]ab + b[MASK]2` -> the masked tokens are deducible from the context), whereas formulas such as `f(x) = 3x+1` are not (e.g., `[MASK](x) = 3x[MASK]1` -> the [MASK] tokens are ambiguous).
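As an illustrative sketch (not part of the original card), character-level random masking of a formula string could look like the following; a real MLM setup would mask tokenizer tokens rather than raw characters:
```python
import random

def mask_formula(formula: str, mask_token: str = "[MASK]", ratio: float = 0.15) -> str:
    """Replace a random fraction of the characters in a formula with a mask token."""
    chars = list(formula)
    n_mask = max(1, int(len(chars) * ratio))
    for i in random.sample(range(len(chars)), n_mask):
        chars[i] = mask_token
    return "".join(chars)

print(mask_formula("(a+b)^2 = a^2 + 2ab + b^2"))
# e.g. (a+[MASK])^2 = a[MASK]2 + 2ab + b^2
```
|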
Nexdata/21404_Images_Human_Posture_Detection_Data_in_Home_Scenes | ---
license: cc-by-nc-nd-4.0
---
## Description
21,404 images of human posture detection data in home scenes. The data covers 101 different indoor home scenes. The gender distribution includes male and female; ages range from young to elderly, with middle-aged and young people in the majority. The data diversity includes multiple scenes, multiple time periods, multiple collecting heights, multiple human body occlusions, and multiple collecting distances. For collection content, human body posture data was collected in different home scenes, with the human bodies lying flat, lying on their sides, or lying on their stomachs. For annotation, human body rectangular bounding boxes were annotated. The data can be used for tasks such as human body detection in home scenes.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1348?source=Huggingface
## Data size
21,404 images; each image includes one human body
## Population distribution
gender distribution: male, female; age distribution: ranging from young to elderly, with middle-aged and young people in the majority; race distribution: Asian
## Collecting environment
101 different indoor home scenes
## Data diversity
multiple scenes, multiple time periods, multiple collecting heights, multiple human body occlusions, multiple collecting distances
## Device
surveillance camera, the resolution is 1,920*1,080 or 2,560*1,920
## Collecting angle
looking down angle
## Collecting height
1 meter, 1.5 meters, 2 meters
## Collecting time
day, night
## Data format
the image data format is .jpg, the annotation file format is .json or .xml
## Collection content
collecting human body posture data in different home scenes; the human bodies were lying flat, lying on their sides, or lying on their stomachs
## Annotation content
human body rectangular bounding boxes were annotated
## Accuracy rate
the rectangular bounding box of a human body is qualified when the deviation is not more than 3 pixels, and the qualification rate of the bounding boxes shall not be lower than 97%
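The annotation schema itself is not documented in this card. As a hypothetical sketch, reading a per-image JSON annotation with a `bbox` field of `[x, y, width, height]` might look like this; adjust the field names to the schema actually shipped with the data:
```python
import json

# Hypothetical layout: {"image": "0001.jpg", "annotations": [{"bbox": [x, y, w, h]}]}
with open("0001.json") as f:
    ann = json.load(f)

for obj in ann.get("annotations", []):
    x, y, w, h = obj["bbox"]  # rectangular bounding box of one human body
    print(f"human body box at ({x}, {y}), size {w}x{h}")
```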
## Licensing Information
Commercial License
|
Sekiraw/small_generated | ---
dataset_info:
features:
- name: ground_truth
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 5075753.25
num_examples: 21
- name: test
num_bytes: 469111.75
num_examples: 3
- name: validation
num_bytes: 469111.75
num_examples: 3
download_size: 5986306
dataset_size: 6013976.75
---
# Dataset Card for "small_generated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Deathspike/stellvia-of-the-universe | ---
license: cc-by-nc-sa-4.0
---
|
sanbongazin/WilladgeArticle | ---
license: mit
---
|
srivats666/cricket-rules-llama2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 6660
num_examples: 101
download_size: 3728
dataset_size: 6660
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nerfgun3/enaic31_LoRA | ---
language:
- en
license: creativeml-openrail-m
thumbnail: "https://huggingface.co/datasets/Nerfgun3/enaic31_LoRA/resolve/main/preview/Preview%20(1).png"
tags:
- stable-diffusion
- text-to-image
- image-to-image
inference: false
---
# Enaic31 Artstyle LoRA
# Use Cases
The LoRA is compatible with a wide range of models; however, it is most effective when used with Kenshi or AbyssOrangeMix2.
The LoRA itself was trained with the token: ```skistyle```.
I would suggest using the token with AbyssOrangeMix2, but not with Kenshi, since I got better results that way.
The models mentioned are:
1. AbyssOrangeMix2 from [WarriorMama777](https://huggingface.co/WarriorMama777/OrangeMixs)
2. Kenshi Model from [Luna](https://huggingface.co/SweetLuna/Kenshi)
## Strength
I would personally use these strengths with the associated models:
Soft-Version:
- 0.6-0.85 for AbyssOrangeMix2
- 0.5-0.75 for Kenshi
Hard-Version:
- 0.4-0.6 for AbyssOrangeMix2
- 0.3-0.55 for Kenshi
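As a usage sketch (not from the original card), the LoRA could be applied with the `diffusers` library; the base-model repo id and weight file name below are assumptions, so check the actual repository contents, and the `scale` value corresponds to the strength ranges above:
```python
import torch
from diffusers import StableDiffusionPipeline

# Base model: use a diffusers-format build of AbyssOrangeMix2 or Kenshi (repo id illustrative).
pipe = StableDiffusionPipeline.from_pretrained(
    "WarriorMama777/OrangeMixs", torch_dtype=torch.float16
).to("cuda")

# Load the LoRA weights; the weight file name is an assumption.
pipe.load_lora_weights("Nerfgun3/enaic31_LoRA", weight_name="enaic31_soft.safetensors")

image = pipe(
    "skistyle, 1girl, solo, looking at viewer",
    num_inference_steps=32,
    guidance_scale=7,
    cross_attention_kwargs={"scale": 0.75},  # LoRA strength, per the ranges above
).images[0]
image.save("preview.png")
```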
# Showcase
**Example 1**
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/enaic31_LoRA/resolve/main/preview/Preview%20(2).png"/>
```
skistyle,
1girl, solo, animal ears, long hair, looking at viewer, bell, upper body, bangs, closed mouth, animal ear fluff, hair between eyes, grey eyes, blush, grey hair, cat ears, neck bell, shirt,
Steps: 32, Sampler: Euler a, CFG scale: 7
```
**Example 2**
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/enaic31_LoRA/resolve/main/preview/Preview%20(3).png"/>
```
skistyle,
1girl, solo, animal ears, long hair, looking at viewer, bell, upper body, bangs, closed mouth, animal ear fluff, hair between eyes, grey eyes, blush, grey hair, cat ears, neck bell, shirt,
Steps: 32, Sampler: Euler a, CFG scale: 7
```
**Example 3**
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/enaic31_LoRA/resolve/main/preview/Preview%20(4).png"/>
```
skistyle,
small breasts, dark-skinned female, shorts, dark skin, hair ornament, black hair, smile, glasses, v, cleavage, hairclip, brown hair, grin, aged up, brown eyes, white background, 1girl, looking at viewer, off shoulder, shirt, sweater, simple background, short shorts, denim shorts
Steps: 32, Sampler: Euler a, CFG scale: 7
```
# License
This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the model to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights over the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
AshtonLKY/resampled_audio_morethan4 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcript
dtype: string
splits:
- name: train
num_bytes: 3245378777.43
num_examples: 21690
download_size: 3671209553
dataset_size: 3245378777.43
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
davanstrien/testgitupload | ---
tags:
- arxiv:2211.10086
---
TEST |
liuyanchen1015/MULTI_VALUE_mnli_volition_changes | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 22327
num_examples: 95
- name: dev_mismatched
num_bytes: 24922
num_examples: 129
- name: test_matched
num_bytes: 41300
num_examples: 161
- name: test_mismatched
num_bytes: 23503
num_examples: 125
- name: train
num_bytes: 1381932
num_examples: 5871
download_size: 851848
dataset_size: 1493984
---
# Dataset Card for "MULTI_VALUE_mnli_volition_changes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo-v1 | ---
pretty_name: Evaluation run of Xenon1/Zenith-7B-dpo-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Xenon1/Zenith-7B-dpo-v1](https://huggingface.co/Xenon1/Zenith-7B-dpo-v1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-15T00:49:59.820976](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo-v1/blob/main/results_2024-02-15T00-49-59.820976.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5994007032450553,\n\
\ \"acc_stderr\": 0.03314392404148924,\n \"acc_norm\": 0.6077867814262741,\n\
\ \"acc_norm_stderr\": 0.033870966769135216,\n \"mc1\": 0.4357405140758874,\n\
\ \"mc1_stderr\": 0.017358345398863127,\n \"mc2\": 0.6059869573691794,\n\
\ \"mc2_stderr\": 0.015948076495091498\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5554607508532423,\n \"acc_stderr\": 0.014521226405627082,\n\
\ \"acc_norm\": 0.6049488054607508,\n \"acc_norm_stderr\": 0.014285898292938163\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6405098585939056,\n\
\ \"acc_stderr\": 0.004788703173474748,\n \"acc_norm\": 0.8295160326628161,\n\
\ \"acc_norm_stderr\": 0.003752888662249574\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246494,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246494\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.026069362295335137,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.026069362295335137\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164525,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164525\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5871794871794872,\n \"acc_stderr\": 0.024962683564331796,\n\
\ \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.024962683564331796\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616265,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616265\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7889908256880734,\n \"acc_stderr\": 0.01749392240411265,\n \"\
acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.01749392240411265\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n\
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281344,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281344\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.014987270640946002,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.014987270640946002\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977247,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977247\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.293854748603352,\n\
\ \"acc_stderr\": 0.015235075776719608,\n \"acc_norm\": 0.293854748603352,\n\
\ \"acc_norm_stderr\": 0.015235075776719608\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.02656892101545715,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.02656892101545715\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.026730620728004913,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.026730620728004913\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.02628973494595293,\n\
\ \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.02628973494595293\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4067796610169492,\n\
\ \"acc_stderr\": 0.012546325596569525,\n \"acc_norm\": 0.4067796610169492,\n\
\ \"acc_norm_stderr\": 0.012546325596569525\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.619281045751634,\n \"acc_stderr\": 0.019643801557924803,\n \
\ \"acc_norm\": 0.619281045751634,\n \"acc_norm_stderr\": 0.019643801557924803\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n\
\ \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4357405140758874,\n\
\ \"mc1_stderr\": 0.017358345398863127,\n \"mc2\": 0.6059869573691794,\n\
\ \"mc2_stderr\": 0.015948076495091498\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7726913970007893,\n \"acc_stderr\": 0.011778612167091087\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16982562547384383,\n \
\ \"acc_stderr\": 0.01034257236086122\n }\n}\n```"
repo_url: https://huggingface.co/Xenon1/Zenith-7B-dpo-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|arc:challenge|25_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|arc:challenge|25_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|gsm8k|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|gsm8k|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hellaswag|10_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hellaswag|10_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T00-43-26.430787.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T00-49-59.820976.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T00-49-59.820976.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- '**/details_harness|winogrande|5_2024-02-15T00-43-26.430787.parquet'
- split: 2024_02_15T00_49_59.820976
path:
- '**/details_harness|winogrande|5_2024-02-15T00-49-59.820976.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-15T00-49-59.820976.parquet'
- config_name: results
data_files:
- split: 2024_02_15T00_43_26.430787
path:
- results_2024-02-15T00-43-26.430787.parquet
- split: 2024_02_15T00_49_59.820976
path:
- results_2024-02-15T00-49-59.820976.parquet
- split: latest
path:
- results_2024-02-15T00-49-59.820976.parquet
---
# Dataset Card for Evaluation run of Xenon1/Zenith-7B-dpo-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Xenon1/Zenith-7B-dpo-v1](https://huggingface.co/Xenon1/Zenith-7B-dpo-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo-v1",
"harness_winogrande_5",
split="train")
```
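The "results" configuration mentioned above can be loaded the same way; a minimal sketch (the `results` config and `latest` split names come from the data_files listing in the header):
```python
from datasets import load_dataset

# "latest" always points to the most recent run's aggregated results.
results = load_dataset("open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo-v1",
	"results",
	split="latest")
```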
## Latest results
These are the [latest results from run 2024-02-15T00:49:59.820976](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo-v1/blob/main/results_2024-02-15T00-49-59.820976.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5994007032450553,
"acc_stderr": 0.03314392404148924,
"acc_norm": 0.6077867814262741,
"acc_norm_stderr": 0.033870966769135216,
"mc1": 0.4357405140758874,
"mc1_stderr": 0.017358345398863127,
"mc2": 0.6059869573691794,
"mc2_stderr": 0.015948076495091498
},
"harness|arc:challenge|25": {
"acc": 0.5554607508532423,
"acc_stderr": 0.014521226405627082,
"acc_norm": 0.6049488054607508,
"acc_norm_stderr": 0.014285898292938163
},
"harness|hellaswag|10": {
"acc": 0.6405098585939056,
"acc_stderr": 0.004788703173474748,
"acc_norm": 0.8295160326628161,
"acc_norm_stderr": 0.003752888662249574
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246494,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246494
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7,
"acc_stderr": 0.026069362295335137,
"acc_norm": 0.7,
"acc_norm_stderr": 0.026069362295335137
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164525,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164525
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.024962683564331796,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.024962683564331796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616265,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616265
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.01749392240411265,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.01749392240411265
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281344,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281344
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946002,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946002
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977247,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977247
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.293854748603352,
"acc_stderr": 0.015235075776719608,
"acc_norm": 0.293854748603352,
"acc_norm_stderr": 0.015235075776719608
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.02656892101545715,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.02656892101545715
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.026730620728004913,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.026730620728004913
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.02628973494595293,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.02628973494595293
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4067796610169492,
"acc_stderr": 0.012546325596569525,
"acc_norm": 0.4067796610169492,
"acc_norm_stderr": 0.012546325596569525
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.619281045751634,
"acc_stderr": 0.019643801557924803,
"acc_norm": 0.619281045751634,
"acc_norm_stderr": 0.019643801557924803
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4357405140758874,
"mc1_stderr": 0.017358345398863127,
"mc2": 0.6059869573691794,
"mc2_stderr": 0.015948076495091498
},
"harness|winogrande|5": {
"acc": 0.7726913970007893,
"acc_stderr": 0.011778612167091087
},
"harness|gsm8k|5": {
"acc": 0.16982562547384383,
"acc_stderr": 0.01034257236086122
}
}
```
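As a sketch of how these numbers might be consumed offline (the filename comes from the link above; that the JSON's top-level "results" key mirrors the dict shown here is an assumption about how these detail repos are laid out, not something stated in this card):
```python
import json
from statistics import mean

from huggingface_hub import hf_hub_download

# Fetch the raw results file for the latest run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo-v1",
    filename="results_2024-02-15T00-49-59.820976.json",
    repo_type="dataset",
)
with open(path) as f:
    run = json.load(f)

# Mean accuracy over the 57 MMLU (hendrycksTest) subtasks,
# assuming run["results"] holds the per-task dicts shown above.
mmlu_accs = [v["acc"] for k, v in run["results"].items() if "hendrycksTest" in k]
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {mean(mmlu_accs):.4f}")
```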
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
irds/trec_cast_offsets | ---
license: lgpl
---
# Dataset Card for trec_cast_offsets
This is a complement to the TREC CaST (2020-22) datasets, with pre-computed offsets relative to the original files. |
fvr2/dataset-test03 | ---
license: other
task_categories:
- text-generation
language:
- en
tags:
- art
--- |
killameep/protogen-data | ---
dataset_info:
features:
- name: source_id
dtype: string
- name: source
dtype: string
- name: image
dtype: image
- name: tags
sequence: string
- name: url
dtype: string
- name: text
dtype: string
- name: selector
dtype: string
splits:
- name: train
num_bytes: 214815973.0
num_examples: 512
download_size: 212424717
dataset_size: 214815973.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "protogen-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Eididkd/Pearl | ---
license: openrail
---
|
autoevaluate/autoeval-eval-project-quoref-bbfe943f-1305449898 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- quoref
eval_info:
task: extractive_question_answering
model: nbroad/deb-base-gc2
metrics: []
dataset_name: quoref
dataset_config: default
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: nbroad/deb-base-gc2
* Dataset: quoref
* Config: default
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nbroad](https://huggingface.co/nbroad) for evaluating this model. |
distilled-one-sec-cv12-each-chunk-uniq/chunk_51 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1248554016.0
num_examples: 243288
download_size: 1280261390
dataset_size: 1248554016.0
---
# Dataset Card for "chunk_51"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Crystalcareai/MoD | ---
license: apache-2.0
datasets:
- jsonifize/Tested-188k-Python-Alpaca_stringified-jsonifize
- Norquinal/WizardLM_alpaca_claude_evol_instruct_70k
- allenai/ai2_arc
- Squish42/bluemoon-fandom-1-1-rp-cleaned
- google/boolq
- LDJnr/Capybara
- mattpscott/airoboros-summarization
- Locutusque/Hercules-v1.0
- lmsys/lmsys-chat-1m
- Muennighoff/natural-instructions
- HuggingFaceH4/no_robots
- grimulkan/PIPPA-augmented-dedup
- euclaise/reddit-instruct
- teknium/OpenHermes-2.5
- ropes
- Open-Orca/SlimOrca-Dedup
- migtissera/Synthia-v1.3
- HuggingFaceH4/ultrachat_200k
- winogrande
- CollectiveCognition/chats-data-2023-09-22
- CollectiveCognition/chats-data-2023-09-27
- CollectiveCognition/chats-data-2023-10-16
- Locutusque/GPT4-LLM-Cleaned-chatml
- Locutusque/GPT4-roleplay-chatml
- Locutusque/GPT4-roleplay-v2-chatml
- Locutusque/WizardLM_evol_instruct_70k_chatml
- Locutusque/camel-chatml
- Locutusque/code-assistant-chatml
- Locutusque/code-assistant-v2-chatml
- Locutusque/dolphin-gpt4-chatml
- Locutusque/function-calling-chatml
- Locutusque/general-instruct-chatml
- Locutusque/lmsys-chat-1m-best
- Locutusque/medtext-chatml
- Locutusque/metamathqa-chatml
- Locutusque/platypus-chatml
- Locutusque/pubmedqa-chatml
- Locutusque/unnatural-instructions-chatml
---
# Please note: this is a dataset that accompanies the model https://huggingface.co/Crystalcareai/Qwen1.5-8x7b. The README is the same for both, with more detail below.
## Hey, I'm Lucas
I'm excited to share an early release of a project that has kept me busy for the last couple of weeks. Mixtral's release propelled me into a deep dive into MoEs.
With the release of Qwen1.5, I was curious to see how it would compare to Mixtral.
Coming from a background as an acting teacher and coach, I saw parallels between high-quality scripts' impact on performances and the importance of curating high-quality data for training models. This led me to explore data curation, especially for training Mixture of Experts (MoE) models. I looked into Teknium's OpenHermes dataset, Jon Durbin's collections on GitHub, and Eric Hartford's methods for achieving specific outcomes with models.
I curated a dataset, named Mixture of Data (MoD), from various sources, including Bagel, OpenHermes, and many more, totaling about 780,000 distinct ShareGPT conversations. This dataset aims to encourage MoE models to develop their own distinct experts.
After training Qwen1.5-7b on 100k random samples from MoD over four epochs and merging the fine-tuned model 8x, I used an approach utilizing a random gate, without specialized fine-tuning done to any of the 8 experts. The result was a model that initially made no sense, lacking a base model and clear guidance on expert usage.
Despite challenges, such as training interruptions from CUDA errors on RunPod, the model showed promising adaptability to the rest of the MoD dataset, even with limited training (0.45 of 4 planned epochs were completed before my compute budget ran out). It performs comparably to Mixtral in (admittedly naive) preliminary reasoning tests.
These weeks have been incredibly rewarding and educational, thanks to the contributions of Jon Durbin, Maxime Labonne, Teknium, Eric Hartford, and Charles Goddard. Their work has made these technologies accessible and inspired my project. A special thank you to Teknium and Eric Hartford, who have been generous with their time - answering my questions with kindness and humility.
I am currently training a 2.0 model - that I expect to beat Mixtral on most benchmarks. Thank you for your interest and support. Let's push the boundaries of what's possible together.
Lucas
datasets used:
- jsonifize/Tested-188k-Python-Alpaca_stringified-jsonifize
- Norquinal/WizardLM_alpaca_claude_evol_instruct_70k
- allenai/ai2_arc
- Squish42/bluemoon-fandom-1-1-rp-cleaned
- google/boolq
- LDJnr/Capybara
- mattpscott/airoboros-summarization
- Locutusque/Hercules-v1.0
- lmsys/lmsys-chat-1m
- Muennighoff/natural-instructions
- HuggingFaceH4/no_robots
- grimulkan/PIPPA-augmented-dedup
- euclaise/reddit-instruct
- teknium/OpenHermes-2.5
- ropes
- Open-Orca/SlimOrca-Dedup
- migtissera/Synthia-v1.3
- HuggingFaceH4/ultrachat_200k
- winogrande
- CollectiveCognition/chats-data-2023-09-22
- CollectiveCognition/chats-data-2023-09-27
- CollectiveCognition/chats-data-2023-10-16
- Locutusque/GPT4-LLM-Cleaned-chatml
- Locutusque/GPT4-roleplay-chatml
- Locutusque/GPT4-roleplay-v2-chatml
- Locutusque/WizardLM_evol_instruct_70k_chatml
- Locutusque/camel-chatml
- Locutusque/code-assistant-chatml
- Locutusque/code-assistant-v2-chatml
- Locutusque/dolphin-gpt4-chatml
- Locutusque/function-calling-chatml
- Locutusque/general-instruct-chatml
- Locutusque/lmsys-chat-1m-best
- Locutusque/medtext-chatml
- Locutusque/metamathqa-chatml
- Locutusque/platypus-chatml
- Locutusque/pubmedqa-chatml
- Locutusque/unnatural-instructions-chatml
|
cakiki/ASE_runs | ---
license: apache-2.0
---
|
joey234/conandoyle_cue_scope | ---
dataset_info:
features:
- name: text
dtype: string
- name: cue
sequence: int64
- name: scope
sequence: int64
splits:
- name: train
num_bytes: 105721
num_examples: 235
download_size: 21262
dataset_size: 105721
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "conandoyle_cue_scope"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
compguesswhat | ---
annotations_creators:
- machine-generated
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- extended|other-guesswhat
task_categories:
- visual-question-answering
task_ids:
- visual-question-answering
paperswithcode_id: compguesswhat
pretty_name: CompGuessWhat?!
dataset_info:
- config_name: compguesswhat-original
features:
- name: id
dtype: int32
- name: target_id
dtype: int32
- name: timestamp
dtype: string
- name: status
dtype: string
- name: image
struct:
- name: id
dtype: int32
- name: file_name
dtype: string
- name: flickr_url
dtype: string
- name: coco_url
dtype: string
- name: height
dtype: int32
- name: width
dtype: int32
- name: visual_genome
struct:
- name: width
dtype: int32
- name: height
dtype: int32
- name: url
dtype: string
- name: coco_id
dtype: int32
- name: flickr_id
dtype: string
- name: image_id
dtype: string
- name: qas
sequence:
- name: question
dtype: string
- name: answer
dtype: string
- name: id
dtype: int32
- name: objects
sequence:
- name: id
dtype: int32
- name: bbox
sequence: float32
length: 4
- name: category
dtype: string
- name: area
dtype: float32
- name: category_id
dtype: int32
- name: segment
sequence:
sequence: float32
splits:
- name: train
num_bytes: 123556580
num_examples: 46341
- name: validation
num_bytes: 25441428
num_examples: 9738
- name: test
num_bytes: 25369227
num_examples: 9621
download_size: 105349759
dataset_size: 174367235
- config_name: compguesswhat-zero_shot
features:
- name: id
dtype: int32
- name: target_id
dtype: string
- name: status
dtype: string
- name: image
struct:
- name: id
dtype: int32
- name: file_name
dtype: string
- name: coco_url
dtype: string
- name: height
dtype: int32
- name: width
dtype: int32
- name: license
dtype: int32
- name: open_images_id
dtype: string
- name: date_captured
dtype: string
- name: objects
sequence:
- name: id
dtype: string
- name: bbox
sequence: float32
length: 4
- name: category
dtype: string
- name: area
dtype: float32
- name: category_id
dtype: int32
- name: IsOccluded
dtype: int32
- name: IsTruncated
dtype: int32
- name: segment
sequence:
- name: MaskPath
dtype: string
- name: LabelName
dtype: string
- name: BoxID
dtype: string
- name: BoxXMin
dtype: string
- name: BoxXMax
dtype: string
- name: BoxYMin
dtype: string
- name: BoxYMax
dtype: string
- name: PredictedIoU
dtype: string
- name: Clicks
dtype: string
splits:
- name: nd_valid
num_bytes: 13510589
num_examples: 5343
- name: nd_test
num_bytes: 36228021
num_examples: 13836
- name: od_valid
num_bytes: 14051972
num_examples: 5372
- name: od_test
num_bytes: 32950869
num_examples: 13300
download_size: 6548812
dataset_size: 96741451
configs:
- config_name: compguesswhat-original
data_files:
- split: train
path: compguesswhat-original/train-*
- split: validation
path: compguesswhat-original/validation-*
- split: test
path: compguesswhat-original/test-*
- config_name: compguesswhat-zero_shot
data_files:
- split: nd_valid
path: compguesswhat-zero_shot/nd_valid-*
- split: nd_test
path: compguesswhat-zero_shot/nd_test-*
- split: od_valid
path: compguesswhat-zero_shot/od_valid-*
- split: od_test
path: compguesswhat-zero_shot/od_test-*
---
# Dataset Card for "compguesswhat"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://compguesswhat.github.io/](https://compguesswhat.github.io/)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** https://arxiv.org/abs/2006.02174
- **Paper:** https://doi.org/10.18653/v1/2020.acl-main.682
- **Point of Contact:** [Alessandro Suglia](mailto:alessandro.suglia@gmail.com)
- **Size of downloaded dataset files:** 112.05 MB
- **Size of the generated dataset:** 271.11 MB
- **Total amount of disk used:** 383.16 MB
### Dataset Summary
CompGuessWhat?! is an instance of a multi-task framework for evaluating the quality of learned neural representations,
in particular concerning attribute grounding. Use this dataset if you want to use the set of games whose reference
scene is an image in VisualGenome. Visit the website for more details: https://compguesswhat.github.io
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### compguesswhat-original
- **Size of downloaded dataset files:** 107.21 MB
- **Size of the generated dataset:** 174.37 MB
- **Total amount of disk used:** 281.57 MB
An example of 'validation' looks as follows.
```
This example was too long and was cropped:
{
"id": 2424,
"image": "{\"coco_url\": \"http://mscoco.org/images/270512\", \"file_name\": \"COCO_train2014_000000270512.jpg\", \"flickr_url\": \"http://farm6.stat...",
"objects": "{\"area\": [1723.5133056640625, 4838.5361328125, 287.44476318359375, 44918.7109375, 3688.09375, 522.1935424804688], \"bbox\": [[5.61...",
"qas": {
"answer": ["Yes", "No", "No", "Yes"],
"id": [4983, 4996, 5006, 5017],
"question": ["Is it in the foreground?", "Does it have wings?", "Is it a person?", "Is it a vehicle?"]
},
"status": "success",
"target_id": 1197044,
"timestamp": "2016-07-08 15:07:38"
}
```
#### compguesswhat-zero_shot
- **Size of downloaded dataset files:** 4.84 MB
- **Size of the generated dataset:** 96.74 MB
- **Total amount of disk used:** 101.59 MB
An example of 'nd_valid' looks as follows.
```
This example was too long and was cropped:
{
"id": 0,
"image": {
"coco_url": "https://s3.amazonaws.com/nocaps/val/004e21eb2e686f40.jpg",
"date_captured": "2018-11-06 11:04:33",
"file_name": "004e21eb2e686f40.jpg",
"height": 1024,
"id": 6,
"license": 0,
"open_images_id": "004e21eb2e686f40",
"width": 768
},
"objects": "{\"IsOccluded\": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], \"IsTruncated\": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], \"area\": [3...",
"status": "incomplete",
"target_id": "004e21eb2e686f40_30"
}
```
### Data Fields
The data fields are the same among all splits.
#### compguesswhat-original
- `id`: a `int32` feature.
- `target_id`: a `int32` feature.
- `timestamp`: a `string` feature.
- `status`: a `string` feature.
- `image`: a dictionary feature containing:
  - `id`: a `int32` feature.
  - `file_name`: a `string` feature.
  - `flickr_url`: a `string` feature.
  - `coco_url`: a `string` feature.
  - `height`: a `int32` feature.
  - `width`: a `int32` feature.
  - `visual_genome`: a dictionary feature containing:
    - `width`: a `int32` feature.
    - `height`: a `int32` feature.
    - `url`: a `string` feature.
    - `coco_id`: a `int32` feature.
    - `flickr_id`: a `string` feature.
    - `image_id`: a `string` feature.
- `qas`: a dictionary feature containing:
  - `question`: a `string` feature.
  - `answer`: a `string` feature.
  - `id`: a `int32` feature.
- `objects`: a dictionary feature containing:
  - `id`: a `int32` feature.
  - `bbox`: a `list` of `float32` features.
  - `category`: a `string` feature.
  - `area`: a `float32` feature.
  - `category_id`: a `int32` feature.
  - `segment`: a dictionary feature containing:
    - `feature`: a `float32` feature.
#### compguesswhat-zero_shot
- `id`: a `int32` feature.
- `target_id`: a `string` feature.
- `status`: a `string` feature.
- `image`: a dictionary feature containing:
  - `id`: a `int32` feature.
  - `file_name`: a `string` feature.
  - `coco_url`: a `string` feature.
  - `height`: a `int32` feature.
  - `width`: a `int32` feature.
  - `license`: a `int32` feature.
  - `open_images_id`: a `string` feature.
  - `date_captured`: a `string` feature.
- `objects`: a dictionary feature containing:
  - `id`: a `string` feature.
  - `bbox`: a `list` of `float32` features.
  - `category`: a `string` feature.
  - `area`: a `float32` feature.
  - `category_id`: a `int32` feature.
  - `IsOccluded`: a `int32` feature.
  - `IsTruncated`: a `int32` feature.
  - `segment`: a dictionary feature containing:
    - `MaskPath`: a `string` feature.
    - `LabelName`: a `string` feature.
    - `BoxID`: a `string` feature.
    - `BoxXMin`: a `string` feature.
    - `BoxXMax`: a `string` feature.
    - `BoxYMin`: a `string` feature.
    - `BoxYMax`: a `string` feature.
    - `PredictedIoU`: a `string` feature.
    - `Clicks`: a `string` feature.
### Data Splits
#### compguesswhat-original
| |train|validation|test|
|----------------------|----:|---------:|---:|
|compguesswhat-original|46341| 9738|9621|
#### compguesswhat-zero_shot
| |nd_valid|od_valid|nd_test|od_test|
|-----------------------|-------:|-------:|------:|------:|
|compguesswhat-zero_shot| 5343| 5372| 13836| 13300|
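A minimal loading sketch (config and split names as listed above; the standard `datasets` API is assumed, and "nd"/"od" are the near-domain/out-of-domain zero-shot scenes from the paper):
```python
from datasets import load_dataset

# Original games, with VisualGenome-backed scenes.
original = load_dataset("compguesswhat", "compguesswhat-original", split="validation")

# Zero-shot games over near-domain (nd) and out-of-domain (od) scenes.
zero_shot = load_dataset("compguesswhat", "compguesswhat-zero_shot", split="nd_valid")

print(original[0]["qas"]["question"])  # questions asked in the first game
```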
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{suglia-etal-2020-compguesswhat,
title = "{C}omp{G}uess{W}hat?!: A Multi-task Evaluation Framework for Grounded Language Learning",
author = "Suglia, Alessandro and
Konstas, Ioannis and
Vanzo, Andrea and
Bastianelli, Emanuele and
Elliott, Desmond and
Frank, Stella and
Lemon, Oliver",
booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.acl-main.682",
pages = "7625--7641",
abstract = "Approaches to Grounded Language Learning are commonly focused on a single task-based final performance measure which may not depend on desirable properties of the learned hidden representations, such as their ability to predict object attributes or generalize to unseen situations. To remedy this, we present GroLLA, an evaluation framework for Grounded Language Learning with Attributes based on three sub-tasks: 1) Goal-oriented evaluation; 2) Object attribute prediction evaluation; and 3) Zero-shot evaluation. We also propose a new dataset CompGuessWhat?! as an instance of this framework for evaluating the quality of learned neural representations, in particular with respect to attribute grounding. To this end, we extend the original GuessWhat?! dataset by including a semantic layer on top of the perceptual one. Specifically, we enrich the VisualGenome scene graphs associated with the GuessWhat?! images with several attributes from resources such as VISA and ImSitu. We then compare several hidden state representations from current state-of-the-art approaches to Grounded Language Learning. By using diagnostic classifiers, we show that current models{'} learned representations are not expressive enough to encode object attributes (average F1 of 44.27). In addition, they do not learn strategies nor representations that are robust enough to perform well when novel scenes or objects are involved in gameplay (zero-shot best accuracy 50.06{\%}).",
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@aleSuglia](https://github.com/aleSuglia), [@lhoestq](https://github.com/lhoestq) for adding this dataset. |
NobodyExistsOnTheInternet/SystemMessageContradictionsSharegpt | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: system message
dtype: string
- name: reversed sysmsg
dtype: string
- name: reversed response
dtype: string
splits:
- name: train
num_bytes: 1286917008
num_examples: 90258
download_size: 392770253
dataset_size: 1286917008
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zetavg/wikipedia_random_page_summaries_zh_tw_5k | ---
dataset_info:
features:
- name: page_title
dtype: string
- name: page_summary
dtype: string
splits:
- name: train
num_bytes: 2053192
num_examples: 4996
download_size: 1498828
dataset_size: 2053192
language:
- zh
---
# Dataset Card for "wikipedia_random_page_summaries_zh_tw_5k"
`page_title` is the original Wikipedia page title, so it may be in Simplified Chinese. `page_summary` is always the Taiwan Traditional Chinese (zh-TW) version.
[vinta/pangu](https://github.com/vinta/pangu.js) was used to ensure that spaces are added between Chinese and English text.
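A minimal usage sketch (column names come from the `dataset_info` above):
```python
from datasets import load_dataset

ds = load_dataset("zetavg/wikipedia_random_page_summaries_zh_tw_5k", split="train")
print(ds[0]["page_title"], "->", ds[0]["page_summary"][:80])
```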
Generated by https://github.com/zetavg/LLM-Research/blob/3b79836/Wikipedia_Random_Page_Summaries_Dataset_Generator.ipynb. |
CVasNLPExperiments/Caltech101_with_background_test_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_300 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 126783
num_examples: 300
download_size: 24456
dataset_size: 126783
---
# Dataset Card for "Caltech101_with_background_test_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_300"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yuqiyun/jojo-stands | ---
license: cc-by-4.0
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/85515c38 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1340
dataset_size: 182
---
# Dataset Card for "85515c38"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_KoboldAI__Mistral-7B-Holodeck-1 | ---
pretty_name: Evaluation run of KoboldAI/Mistral-7B-Holodeck-1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KoboldAI/Mistral-7B-Holodeck-1](https://huggingface.co/KoboldAI/Mistral-7B-Holodeck-1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KoboldAI__Mistral-7B-Holodeck-1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-20T04:18:05.074258](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__Mistral-7B-Holodeck-1/blob/main/results_2024-02-20T04-18-05.074258.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6230452183938985,\n\
\ \"acc_stderr\": 0.03270548174761643,\n \"acc_norm\": 0.6296193199823876,\n\
\ \"acc_norm_stderr\": 0.03337093310013982,\n \"mc1\": 0.2741738066095471,\n\
\ \"mc1_stderr\": 0.015616518497219374,\n \"mc2\": 0.4152659088750174,\n\
\ \"mc2_stderr\": 0.014077149593469703\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5588737201365188,\n \"acc_stderr\": 0.014509747749064664,\n\
\ \"acc_norm\": 0.6023890784982935,\n \"acc_norm_stderr\": 0.01430175222327954\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6232822146982673,\n\
\ \"acc_stderr\": 0.004835728903731397,\n \"acc_norm\": 0.8253335988846843,\n\
\ \"acc_norm_stderr\": 0.0037890554870031886\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800886,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800886\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067884,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067884\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945637,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945637\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812142,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812142\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066468,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066468\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203624,\n \"\
acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203624\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.03381200005643526,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.03381200005643526\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.03096451792692339,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.03096451792692339\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.02466249684520983,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.02466249684520983\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407003,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407003\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28044692737430166,\n\
\ \"acc_stderr\": 0.015024083883322877,\n \"acc_norm\": 0.28044692737430166,\n\
\ \"acc_norm_stderr\": 0.015024083883322877\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137904,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137904\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900926,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900926\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.43285528031290743,\n \"acc_stderr\": 0.012654565234622866,\n\
\ \"acc_norm\": 0.43285528031290743,\n \"acc_norm_stderr\": 0.012654565234622866\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n \"\
acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249776,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249776\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2741738066095471,\n\
\ \"mc1_stderr\": 0.015616518497219374,\n \"mc2\": 0.4152659088750174,\n\
\ \"mc2_stderr\": 0.014077149593469703\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7671665351223362,\n \"acc_stderr\": 0.011878201073856544\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3366186504927976,\n \
\ \"acc_stderr\": 0.013016463679983364\n }\n}\n```"
repo_url: https://huggingface.co/KoboldAI/Mistral-7B-Holodeck-1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|arc:challenge|25_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|gsm8k|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hellaswag|10_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T04-18-05.074258.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T04-18-05.074258.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- '**/details_harness|winogrande|5_2024-02-20T04-18-05.074258.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-20T04-18-05.074258.parquet'
- config_name: results
data_files:
- split: 2024_02_20T04_18_05.074258
path:
- results_2024-02-20T04-18-05.074258.parquet
- split: latest
path:
- results_2024-02-20T04-18-05.074258.parquet
---
# Dataset Card for Evaluation run of KoboldAI/Mistral-7B-Holodeck-1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KoboldAI/Mistral-7B-Holodeck-1](https://huggingface.co/KoboldAI/Mistral-7B-Holodeck-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KoboldAI__Mistral-7B-Holodeck-1",
"harness_winogrande_5",
split="train")
```
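You can also load the aggregated metrics directly. A minimal sketch (the `results` configuration and its `latest` split are declared in the YAML configs above):
```python
from datasets import load_dataset

# Load the aggregated results rather than the per-task details; per the
# configs above, the "latest" split points at the most recent run's file.
results = load_dataset(
    "open-llm-leaderboard/details_KoboldAI__Mistral-7B-Holodeck-1",
    "results",
    split="latest",
)
```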
## Latest results
These are the [latest results from run 2024-02-20T04:18:05.074258](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__Mistral-7B-Holodeck-1/blob/main/results_2024-02-20T04-18-05.074258.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6230452183938985,
"acc_stderr": 0.03270548174761643,
"acc_norm": 0.6296193199823876,
"acc_norm_stderr": 0.03337093310013982,
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219374,
"mc2": 0.4152659088750174,
"mc2_stderr": 0.014077149593469703
},
"harness|arc:challenge|25": {
"acc": 0.5588737201365188,
"acc_stderr": 0.014509747749064664,
"acc_norm": 0.6023890784982935,
"acc_norm_stderr": 0.01430175222327954
},
"harness|hellaswag|10": {
"acc": 0.6232822146982673,
"acc_stderr": 0.004835728903731397,
"acc_norm": 0.8253335988846843,
"acc_norm_stderr": 0.0037890554870031886
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800886,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800886
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067884,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067884
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945637,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945637
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812142,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812142
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066468,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066468
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.016785481159203624,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.016785481159203624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.03381200005643526,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.03381200005643526
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.03096451792692339,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.03096451792692339
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520983,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520983
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407003,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407003
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28044692737430166,
"acc_stderr": 0.015024083883322877,
"acc_norm": 0.28044692737430166,
"acc_norm_stderr": 0.015024083883322877
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137904,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137904
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900926,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622866,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622866
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249776,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249776
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764004,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764004
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219374,
"mc2": 0.4152659088750174,
"mc2_stderr": 0.014077149593469703
},
"harness|winogrande|5": {
"acc": 0.7671665351223362,
"acc_stderr": 0.011878201073856544
},
"harness|gsm8k|5": {
"acc": 0.3366186504927976,
"acc_stderr": 0.013016463679983364
}
}
```
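If you prefer to inspect the raw results file directly, a minimal sketch is shown below. It assumes the JSON keeps the layout printed above; whether the metrics sit at the top level or under a `results` key is an assumption, handled defensively:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_KoboldAI__Mistral-7B-Holodeck-1",
    filename="results_2024-02-20T04-18-05.074258.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Assumption: the metrics may sit at the top level or under a "results" key.
metrics = data.get("results", data)
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])
```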
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zolak/twitter_dataset_1712978129 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2820716
num_examples: 9647
download_size: 1463840
dataset_size: 2820716
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_mnli_completive_finish | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 188294
num_examples: 793
- name: dev_mismatched
num_bytes: 201898
num_examples: 788
- name: test_matched
num_bytes: 222194
num_examples: 875
- name: test_mismatched
num_bytes: 199177
num_examples: 826
- name: train
num_bytes: 8086966
num_examples: 32860
download_size: 5374782
dataset_size: 8898529
---
# Dataset Card for "MULTI_VALUE_mnli_completive_finish"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_InnerI__InnerIAI-chat-7b-grok | ---
pretty_name: Evaluation run of InnerI/InnerIAI-chat-7b-grok
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [InnerI/InnerIAI-chat-7b-grok](https://huggingface.co/InnerI/InnerIAI-chat-7b-grok)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_InnerI__InnerIAI-chat-7b-grok\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T14:54:07.478425](https://huggingface.co/datasets/open-llm-leaderboard/details_InnerI__InnerIAI-chat-7b-grok/blob/main/results_2024-03-24T14-54-07.478425.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5353498519757834,\n\
\ \"acc_stderr\": 0.033750047702372644,\n \"acc_norm\": 0.5419252013103447,\n\
\ \"acc_norm_stderr\": 0.03448883508762743,\n \"mc1\": 0.31701346389228885,\n\
\ \"mc1_stderr\": 0.016289203374403385,\n \"mc2\": 0.46558746897101605,\n\
\ \"mc2_stderr\": 0.014978038075716977\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4854948805460751,\n \"acc_stderr\": 0.014605241081370053,\n\
\ \"acc_norm\": 0.5213310580204779,\n \"acc_norm_stderr\": 0.014598087973127108\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5652260505875324,\n\
\ \"acc_stderr\": 0.004947141797384127,\n \"acc_norm\": 0.7538338976299542,\n\
\ \"acc_norm_stderr\": 0.0042989606748115765\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.040463368839782514,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.040463368839782514\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.030197611600197946,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.030197611600197946\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110175,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110175\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3544973544973545,\n \"acc_stderr\": 0.024636830602841997,\n \"\
acc_norm\": 0.3544973544973545,\n \"acc_norm_stderr\": 0.024636830602841997\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6096774193548387,\n \"acc_stderr\": 0.027751256636969576,\n \"\
acc_norm\": 0.6096774193548387,\n \"acc_norm_stderr\": 0.027751256636969576\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406795,\n \"\
acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406795\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713546,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713546\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.031195840877700304,\n\
\ \"acc_norm\": 0.7512953367875648,\n \"acc_norm_stderr\": 0.031195840877700304\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5487179487179488,\n \"acc_stderr\": 0.025230381238934837,\n\
\ \"acc_norm\": 0.5487179487179488,\n \"acc_norm_stderr\": 0.025230381238934837\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.0279404571362284,\n \"acc_norm\":\
\ 0.3,\n \"acc_norm_stderr\": 0.0279404571362284\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03242225027115007,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03242225027115007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7192660550458716,\n \"acc_stderr\": 0.019266055045871627,\n \"\
acc_norm\": 0.7192660550458716,\n \"acc_norm_stderr\": 0.019266055045871627\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3101851851851852,\n \"acc_stderr\": 0.03154696285656628,\n \"\
acc_norm\": 0.3101851851851852,\n \"acc_norm_stderr\": 0.03154696285656628\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5931372549019608,\n \"acc_stderr\": 0.03447891136353382,\n \"\
acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.03447891136353382\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6624472573839663,\n \"acc_stderr\": 0.030781549102026223,\n \
\ \"acc_norm\": 0.6624472573839663,\n \"acc_norm_stderr\": 0.030781549102026223\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161551,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161551\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n\
\ \"acc_stderr\": 0.027421007295392943,\n \"acc_norm\": 0.7735042735042735,\n\
\ \"acc_norm_stderr\": 0.027421007295392943\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7381864623243933,\n\
\ \"acc_stderr\": 0.01572083867844526,\n \"acc_norm\": 0.7381864623243933,\n\
\ \"acc_norm_stderr\": 0.01572083867844526\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531018,\n\
\ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531018\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28938547486033517,\n\
\ \"acc_stderr\": 0.015166544550490301,\n \"acc_norm\": 0.28938547486033517,\n\
\ \"acc_norm_stderr\": 0.015166544550490301\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631452,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631452\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6270096463022508,\n\
\ \"acc_stderr\": 0.027466610213140105,\n \"acc_norm\": 0.6270096463022508,\n\
\ \"acc_norm_stderr\": 0.027466610213140105\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.027431623722415005,\n\
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.027431623722415005\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.02909767559946393,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.02909767559946393\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3859191655801825,\n\
\ \"acc_stderr\": 0.012433398911476143,\n \"acc_norm\": 0.3859191655801825,\n\
\ \"acc_norm_stderr\": 0.012433398911476143\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5294117647058824,\n \"acc_stderr\": 0.02019280827143379,\n \
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.02019280827143379\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.04653429807913509,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.04653429807913509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5551020408163265,\n \"acc_stderr\": 0.031814251181977865,\n\
\ \"acc_norm\": 0.5551020408163265,\n \"acc_norm_stderr\": 0.031814251181977865\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31701346389228885,\n\
\ \"mc1_stderr\": 0.016289203374403385,\n \"mc2\": 0.46558746897101605,\n\
\ \"mc2_stderr\": 0.014978038075716977\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7229676400947119,\n \"acc_stderr\": 0.012577891015342416\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18423047763457165,\n \
\ \"acc_stderr\": 0.010678414428555006\n }\n}\n```"
repo_url: https://huggingface.co/InnerI/InnerIAI-chat-7b-grok
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|arc:challenge|25_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|gsm8k|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hellaswag|10_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T14-54-07.478425.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T14-54-07.478425.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- '**/details_harness|winogrande|5_2024-03-24T14-54-07.478425.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T14-54-07.478425.parquet'
- config_name: results
data_files:
- split: 2024_03_24T14_54_07.478425
path:
- results_2024-03-24T14-54-07.478425.parquet
- split: latest
path:
- results_2024-03-24T14-54-07.478425.parquet
---
# Dataset Card for Evaluation run of InnerI/InnerIAI-chat-7b-grok
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [InnerI/InnerIAI-chat-7b-grok](https://huggingface.co/InnerI/InnerIAI-chat-7b-grok) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_InnerI__InnerIAI-chat-7b-grok",
"harness_winogrande_5",
split="train")
```
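Similarly, the aggregated metrics can be loaded from the "results" configuration (a minimal sketch; the configuration and split names come from the metadata above):
```python
from datasets import load_dataset

# The "latest" split always points to the most recent run; other splits
# are named with the run timestamp.
results = load_dataset(
    "open-llm-leaderboard/details_InnerI__InnerIAI-chat-7b-grok",
    "results",
    split="latest",
)
```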
## Latest results
These are the [latest results from run 2024-03-24T14:54:07.478425](https://huggingface.co/datasets/open-llm-leaderboard/details_InnerI__InnerIAI-chat-7b-grok/blob/main/results_2024-03-24T14-54-07.478425.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5353498519757834,
"acc_stderr": 0.033750047702372644,
"acc_norm": 0.5419252013103447,
"acc_norm_stderr": 0.03448883508762743,
"mc1": 0.31701346389228885,
"mc1_stderr": 0.016289203374403385,
"mc2": 0.46558746897101605,
"mc2_stderr": 0.014978038075716977
},
"harness|arc:challenge|25": {
"acc": 0.4854948805460751,
"acc_stderr": 0.014605241081370053,
"acc_norm": 0.5213310580204779,
"acc_norm_stderr": 0.014598087973127108
},
"harness|hellaswag|10": {
"acc": 0.5652260505875324,
"acc_stderr": 0.004947141797384127,
"acc_norm": 0.7538338976299542,
"acc_norm_stderr": 0.0042989606748115765
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.040463368839782514,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.040463368839782514
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.030197611600197946,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.030197611600197946
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110175,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110175
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3544973544973545,
"acc_stderr": 0.024636830602841997,
"acc_norm": 0.3544973544973545,
"acc_norm_stderr": 0.024636830602841997
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6096774193548387,
"acc_stderr": 0.027751256636969576,
"acc_norm": 0.6096774193548387,
"acc_norm_stderr": 0.027751256636969576
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713546,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713546
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7512953367875648,
"acc_stderr": 0.031195840877700304,
"acc_norm": 0.7512953367875648,
"acc_norm_stderr": 0.031195840877700304
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5487179487179488,
"acc_stderr": 0.025230381238934837,
"acc_norm": 0.5487179487179488,
"acc_norm_stderr": 0.025230381238934837
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.0279404571362284,
"acc_norm": 0.3,
"acc_norm_stderr": 0.0279404571362284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03242225027115007,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03242225027115007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7192660550458716,
"acc_stderr": 0.019266055045871627,
"acc_norm": 0.7192660550458716,
"acc_norm_stderr": 0.019266055045871627
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3101851851851852,
"acc_stderr": 0.03154696285656628,
"acc_norm": 0.3101851851851852,
"acc_norm_stderr": 0.03154696285656628
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.03447891136353382,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.03447891136353382
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6624472573839663,
"acc_stderr": 0.030781549102026223,
"acc_norm": 0.6624472573839663,
"acc_norm_stderr": 0.030781549102026223
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161551,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161551
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7735042735042735,
"acc_stderr": 0.027421007295392943,
"acc_norm": 0.7735042735042735,
"acc_norm_stderr": 0.027421007295392943
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7381864623243933,
"acc_stderr": 0.01572083867844526,
"acc_norm": 0.7381864623243933,
"acc_norm_stderr": 0.01572083867844526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6502890173410405,
"acc_stderr": 0.025674281456531018,
"acc_norm": 0.6502890173410405,
"acc_norm_stderr": 0.025674281456531018
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28938547486033517,
"acc_stderr": 0.015166544550490301,
"acc_norm": 0.28938547486033517,
"acc_norm_stderr": 0.015166544550490301
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.027996723180631452,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.027996723180631452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6270096463022508,
"acc_stderr": 0.027466610213140105,
"acc_norm": 0.6270096463022508,
"acc_norm_stderr": 0.027466610213140105
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.027431623722415005,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.027431623722415005
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3859191655801825,
"acc_stderr": 0.012433398911476143,
"acc_norm": 0.3859191655801825,
"acc_norm_stderr": 0.012433398911476143
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.49264705882352944,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.49264705882352944,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.02019280827143379,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.02019280827143379
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913509,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5551020408163265,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.5551020408163265,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31701346389228885,
"mc1_stderr": 0.016289203374403385,
"mc2": 0.46558746897101605,
"mc2_stderr": 0.014978038075716977
},
"harness|winogrande|5": {
"acc": 0.7229676400947119,
"acc_stderr": 0.012577891015342416
},
"harness|gsm8k|5": {
"acc": 0.18423047763457165,
"acc_stderr": 0.010678414428555006
}
}
```
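Because the block above is plain JSON, it is easy to post-process; below is a minimal sketch (assuming the block has been saved locally as `results.json`, a hypothetical filename) that averages `acc_norm` over the MMLU ("hendrycksTest") subtasks:
```python
import json

# A sketch only: "results.json" is an assumed local copy of the JSON block above.
with open("results.json") as f:
    results = json.load(f)

# Average acc_norm across the MMLU ("hendrycksTest") subtasks; other tasks
# (arc, hellaswag, winogrande, gsm8k, truthfulqa) do not match the prefix.
scores = [v["acc_norm"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(f"{len(scores)} MMLU tasks, mean acc_norm: {sum(scores) / len(scores):.4f}")
```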
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
terworld/pic | ---
license: openrail
---
|
alex43219/prolog-dataset-full | ---
annotations_creators:
- machine-generated
language:
- code
language_creators:
- crowdsourced
license: []
multilinguality:
- monolingual
pretty_name: Prolog dataset
size_categories:
- 100K<n<1M
source_datasets: []
tags: []
task_categories:
- other
task_ids: []
---
Dataset with Prolog code / query pairs and execution results.
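A minimal loading sketch (the card does not document the schema, so the code below only discovers it; the split name and everything else assume the standard `datasets` API):
```python
from datasets import load_dataset

# A sketch only: the split name "train" and the schema are assumptions,
# since the card does not document them.
ds = load_dataset("alex43219/prolog-dataset-full", split="train")
print(ds.features)  # inspect the actual column names and types
print(ds[0])        # expected: a Prolog code/query pair plus its execution result
```
 |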
dipesh/Intent-Classification-small | ---
dataset_info:
features:
- name: text
dtype: string
- name: intent
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: label
dtype:
class_label:
names:
'0': goodbye
'1': volume control
'2': play games
'3': covid cases
'4': open website
'5': tell me joke
'6': play on youtube
'7': places near me
'8': greet and hello hi kind of things, general check in
'9': asking time
'10': asking date
'11': tell me news
'12': asking weather
'13': download youtube video
'14': what can you do
'15': take screenshot
'16': send email
'17': i am bored
'18': click photo
'19': tell me about
'20': send whatsapp message
splits:
- name: train
num_bytes: 630723
num_examples: 6153
- name: validation
num_bytes: 71230
num_examples: 684
download_size: 201336
dataset_size: 701953
---
# Dataset Card for "Intent-Classification-small"
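The `label` column is a `ClassLabel` over the 21 intents listed in the YAML above; here is a minimal sketch (assuming the standard `datasets` API) of loading the train split and recovering the human-readable intent names:
```python
from datasets import load_dataset

# Load the train split; the text and label columns come straight from the card's schema.
ds = load_dataset("dipesh/Intent-Classification-small", split="train")

# The ClassLabel feature maps integer ids back to intent names.
label_names = ds.features["label"].names
example = ds[0]
print(example["text"], "->", label_names[example["label"]])
```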
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Kabster__BioMistral-Zephyr-Beta-SLERP | ---
pretty_name: Evaluation run of Kabster/BioMistral-Zephyr-Beta-SLERP
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kabster/BioMistral-Zephyr-Beta-SLERP](https://huggingface.co/Kabster/BioMistral-Zephyr-Beta-SLERP)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kabster__BioMistral-Zephyr-Beta-SLERP\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T23:17:12.005512](https://huggingface.co/datasets/open-llm-leaderboard/details_Kabster__BioMistral-Zephyr-Beta-SLERP/blob/main/results_2024-03-09T23-17-12.005512.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5995043159633443,\n\
\ \"acc_stderr\": 0.033015404417283706,\n \"acc_norm\": 0.6105459399539238,\n\
\ \"acc_norm_stderr\": 0.03391082208761106,\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.01699762787190792,\n \"mc2\": 0.5460488636416867,\n\
\ \"mc2_stderr\": 0.015366957850368226\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.014401366641216386,\n\
\ \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000326\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6591316470822546,\n\
\ \"acc_stderr\": 0.004730324556624127,\n \"acc_norm\": 0.8412666799442342,\n\
\ \"acc_norm_stderr\": 0.003646803899770339\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443865,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443865\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.024870815251057093,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.024870815251057093\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n\
\ \"acc_stderr\": 0.02564938106302925,\n \"acc_norm\": 0.7161290322580646,\n\
\ \"acc_norm_stderr\": 0.02564938106302925\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605247,\n \
\ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605247\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977927,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977927\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7743119266055046,\n \"acc_stderr\": 0.017923087667803064,\n \"\
acc_norm\": 0.7743119266055046,\n \"acc_norm_stderr\": 0.017923087667803064\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.02798569938703643,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.02798569938703643\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514511,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514511\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543674,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543674\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.014987270640946012,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.014987270640946012\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28268156424581004,\n\
\ \"acc_stderr\": 0.0150603817300181,\n \"acc_norm\": 0.28268156424581004,\n\
\ \"acc_norm_stderr\": 0.0150603817300181\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.029525914302558562,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.029525914302558562\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.438722294654498,\n\
\ \"acc_stderr\": 0.012673969883493272,\n \"acc_norm\": 0.438722294654498,\n\
\ \"acc_norm_stderr\": 0.012673969883493272\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.029289413409403192,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.029289413409403192\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6160130718954249,\n \"acc_stderr\": 0.019675808135281504,\n \
\ \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.019675808135281504\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.01699762787190792,\n \"mc2\": 0.5460488636416867,\n\
\ \"mc2_stderr\": 0.015366957850368226\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Kabster/BioMistral-Zephyr-Beta-SLERP
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|winogrande|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T23-17-12.005512.parquet'
- config_name: results
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- results_2024-03-09T23-17-12.005512.parquet
- split: latest
path:
- results_2024-03-09T23-17-12.005512.parquet
---
# Dataset Card for Evaluation run of Kabster/BioMistral-Zephyr-Beta-SLERP
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kabster/BioMistral-Zephyr-Beta-SLERP](https://huggingface.co/Kabster/BioMistral-Zephyr-Beta-SLERP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kabster__BioMistral-Zephyr-Beta-SLERP",
"harness_winogrande_5",
split="train")
```
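The aggregated metrics live in the `results` configuration listed in the YAML above; here is a minimal sketch of pulling them via the `latest` split:
```python
from datasets import load_dataset

# The "results" config aggregates all task metrics for this run;
# its "latest" split always points to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_Kabster__BioMistral-Zephyr-Beta-SLERP",
    "results",
    split="latest",
)
```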
## Latest results
These are the [latest results from run 2024-03-09T23:17:12.005512](https://huggingface.co/datasets/open-llm-leaderboard/details_Kabster__BioMistral-Zephyr-Beta-SLERP/blob/main/results_2024-03-09T23-17-12.005512.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5995043159633443,
"acc_stderr": 0.033015404417283706,
"acc_norm": 0.6105459399539238,
"acc_norm_stderr": 0.03391082208761106,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190792,
"mc2": 0.5460488636416867,
"mc2_stderr": 0.015366957850368226
},
"harness|arc:challenge|25": {
"acc": 0.5844709897610921,
"acc_stderr": 0.014401366641216386,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000326
},
"harness|hellaswag|10": {
"acc": 0.6591316470822546,
"acc_stderr": 0.004730324556624127,
"acc_norm": 0.8412666799442342,
"acc_norm_stderr": 0.003646803899770339
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443865,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443865
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.024870815251057093,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.024870815251057093
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.02564938106302925,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.02564938106302925
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605247,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605247
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977927,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977927
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7743119266055046,
"acc_stderr": 0.017923087667803064,
"acc_norm": 0.7743119266055046,
"acc_norm_stderr": 0.017923087667803064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.02798569938703643,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.02798569938703643
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514511,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514511
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543674,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543674
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946012,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946012
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28268156424581004,
"acc_stderr": 0.0150603817300181,
"acc_norm": 0.28268156424581004,
"acc_norm_stderr": 0.0150603817300181
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.029525914302558562,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.029525914302558562
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.438722294654498,
"acc_stderr": 0.012673969883493272,
"acc_norm": 0.438722294654498,
"acc_norm_stderr": 0.012673969883493272
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.029289413409403192,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.029289413409403192
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.019675808135281504,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.019675808135281504
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190792,
"mc2": 0.5460488636416867,
"mc2_stderr": 0.015366957850368226
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183524
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kotzeje/lamini_docs.jsonl | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 573589
num_examples: 1400
download_size: 283465
dataset_size: 573589
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "lamini_docs.jsonl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/lmsys_chatbot_arena_conversations_gpt4_gpt35turbo_claudy | ---
dataset_info:
features:
- name: question_id
dtype: string
- name: model_a_b
dtype: string
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
- name: model_name
dtype: string
splits:
- name: train
num_bytes: 17026152
num_examples: 12798
download_size: 8990072
dataset_size: 17026152
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "lmsys_chatbot_arena_conversations_gpt4_gpt-3.5-turbo_claudy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/Driver_Behavior_Collection_Data | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Driver_Behavior_Collection_Data
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/963?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
1,003-person driver behavior collection data. The data covers multiple age groups and multiple time periods. The driver behaviors include dangerous behavior, fatigue behavior, and visual-movement behavior. In terms of devices, binocular cameras with RGB and infrared channels were used. This data can be used for tasks such as driver behavior analysis.
For more details, please refer to the link: https://www.nexdata.ai/datasets/963?source=Huggingface
### Supported Tasks and Leaderboards
face-detection, computer-vision: The dataset can be used to train a model for face detection.
### Languages
English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
pvduy/mix_gpt4_6k_camel_rlhf | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 125128244
num_examples: 25584
- name: test
num_bytes: 10976904
num_examples: 1621
download_size: 63346955
dataset_size: 136105148
---
# Dataset Card for "mix_gpt4_6k_camel_rlhf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
juraj-juraj/python_googlestyle_docstrings | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: docstring
dtype: string
- name: function
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 25427503
num_examples: 27895
- name: validation
num_bytes: 1176962
num_examples: 1000
- name: test
num_bytes: 1016544
num_examples: 1000
download_size: 10592938
dataset_size: 27621009
---
|
atmallen/quirky_sciq_pythia-410m_bob_easy | ---
dataset_info:
features:
- name: id
dtype: string
- name: choices
sequence: string
- name: label
dtype: int64
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: character
dtype: string
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: bob_log_odds
dtype: float64
splits:
- name: train
num_bytes: 3637062.50295402
num_examples: 5838
- name: validation
num_bytes: 304362.045
num_examples: 494
- name: test
num_bytes: 316343.916
num_examples: 504
download_size: 1395650
dataset_size: 4257768.46395402
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
deepghs/anime_teen | ---
license: mit
task_categories:
- image-classification
tags:
- art
- not-for-all-audiences
size_categories:
- 10K<n<100K
---
Some American websites (such as civitai) have strict restrictions on content that could be seen as depicting minors (in practice the moderation is quite broad and often nonsensical, and any picture of a younger-looking character may violate the rules). We therefore think it is necessary to collect such data and train a classification model to help content creators avoid this potential risk as much as possible.
The dataset contains the following labels:
* `contentious`, corresponding to the [contentious](https://beta.sankakucomplex.com/zh-CN/tag/en?tagName=contentious_content) tag on sankaku, contains pornographic content about loli, shota, etc.
* `safe_teen`, contains non-pornographic content about young children.
* `non_teen`, contains anything that is not a young boy or girl (whether safe, sexy, or pornographic).
Please note:
* The above categories apply at the visual level; the actual stated age of the characters is not important (the same is often true of website moderation rules).
* The datasets were obtained by crawling, so very high purity is not guaranteed; noise-robust deep learning methods are therefore recommended during training (see the loading sketch below).
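A minimal loading sketch (the split name and column layout are assumptions on our part, since the card does not document the schema):
```python
from datasets import load_dataset

# Assumption: an image-classification layout covering the three classes above.
ds = load_dataset("deepghs/anime_teen", split="train")
print(ds.features)  # inspect the actual schema before relying on column names
```
|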
distil-whisper/voxpopuli-timestamped | ---
license: cc0-1.0
task_categories:
- automatic-speech-recognition
language:
- en
pretty_name: VoxPopuli
---
# Distil Whisper: VoxPopuli With Timestamps
This is a variant of the [VoxPopuli](https://huggingface.co/datasets/facebook/voxpopuli) dataset, augmented to return the pseudo-labelled Whisper
transcriptions alongside the original dataset elements. The pseudo-labelled transcriptions were generated by
labelling the input audio data with the Whisper [large-v2](https://huggingface.co/openai/whisper-large-v2)
model with *greedy* sampling and timestamp prediction. For information on how the original dataset was curated, refer to the original
[dataset card](https://huggingface.co/datasets/facebook/voxpopuli).
## Standalone Usage
First, install the latest version of the 🤗 Datasets package:
```bash
pip install --upgrade pip
pip install --upgrade datasets[audio]
```
The dataset can be downloaded and pre-processed on disk using the [`load_dataset`](https://huggingface.co/docs/datasets/v2.14.5/en/package_reference/loading_methods#datasets.load_dataset)
function:
```python
from datasets import load_dataset
dataset = load_dataset("distil-whisper/voxpopuli", "en")
# take the first sample of the validation set
sample = dataset["validation"][0]
```
It can also be streamed directly from the Hub using Datasets' [streaming mode](https://huggingface.co/blog/audio-datasets#streaming-mode-the-silver-bullet).
Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire
dataset to disk:
```python
from datasets import load_dataset
dataset = load_dataset("distil-whisper/voxpopuli", "en", streaming=True)
# take the first sample of the validation set
sample = next(iter(dataset["validation"]))
```
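The pseudo-labelling procedure described above can be approximated with the 🤗 Transformers ASR pipeline. The following is a rough sketch, not the actual Distil Whisper labelling script; the 30-second chunking is our assumption for long-form audio:
```python
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="openai/whisper-large-v2",
    chunk_length_s=30,  # assumption: 30-second chunking for long-form audio
)

# `sample` is a dataset element loaded as in the examples above
prediction = asr(sample["audio"], return_timestamps=True)  # greedy decoding is the pipeline default
print(prediction["text"])    # full transcription
print(prediction["chunks"])  # segments with (start, end) timestamps
```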
## Distil Whisper Usage
To use this dataset to reproduce a Distil Whisper training run, refer to the instructions on the
[Distil Whisper repository](https://github.com/huggingface/distil-whisper#training).
## License
This dataset is licensed under cc0-1.0.
|
ricahrd/Matue | ---
license: openrail
---
|
KELONMYOSA/dusha_emotion_audio | ---
task_categories:
- audio-classification
language:
- ru
size_categories:
- 100K<n<1M
pretty_name: Russian speech emotions
---
This dataset was taken from the creators' [GitHub repository](https://github.com/salute-developers/golos/tree/master/dusha) and converted for my own study needs.
# Dusha dataset
Dusha is a bi-modal corpus suitable for speech emotion recognition (SER) tasks. The dataset consists of about 300 000 audio recordings with Russian speech, their transcripts and emotional labels. The corpus contains approximately 350 hours of data. Four basic emotions that usually appear in a dialog with a virtual assistant were selected: Happiness (Positive), Sadness, Anger and Neutral emotion.
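As a minimal loading sketch (streaming avoids downloading the full ~350 hours of audio up front; the split and column names are assumptions, not documented by this card):
```python
from datasets import load_dataset

# Assumption: a "train" split with an audio column, transcript, and emotion label.
ds = load_dataset("KELONMYOSA/dusha_emotion_audio", split="train", streaming=True)
sample = next(iter(ds))
print(sample.keys())  # inspect the actual fields before relying on them
```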
## **License**
[English Version](https://github.com/salute-developers/golos/blob/master/license/en_us.pdf)
[Russian Version](https://github.com/salute-developers/golos/blob/master/license/ru.pdf)
## **Authors**
- Artem Sokolov
- Fedor Minkin
- Nikita Savushkin
- Nikolay Karpov
- Oleg Kutuzov
- Vladimir Kondratenko |
substratusai/the-stack-yaml-k8s | ---
annotations_creators: []
language_creators:
- crowdsourced
- expert-generated
language:
- code
license:
- other
multilinguality:
- multilingual
pretty_name: The-Stack
size_categories:
- unknown
source_datasets: []
task_categories:
- text-generation
task_ids: []
extra_gated_prompt: |-
## Terms of Use for The Stack
The Stack dataset is a collection of source code in over 300 programming languages. We ask that you read and acknowledge the following points before using the dataset:
1. The Stack is a collection of source code from repositories with various licenses. Any use of all or part of the code gathered in The Stack must abide by the terms of the original licenses, including attribution clauses when relevant. We facilitate this by providing provenance information for each data point.
2. The Stack is regularly updated to enact validated data removal requests. By clicking on "Access repository", you agree to update your own version of The Stack to the most recent usable version specified by the maintainers in [the following thread](https://huggingface.co/datasets/bigcode/the-stack/discussions/7). If you have questions about dataset versions and allowed uses, please also ask them in the dataset’s [community discussions](https://huggingface.co/datasets/bigcode/the-stack/discussions/new). We will also notify users via email when the latest usable version changes.
  3. To host, share, or otherwise provide access to The Stack dataset, you must include [these Terms of Use](https://huggingface.co/datasets/bigcode/the-stack#terms-of-use-for-the-stack) and require users to agree to them.
By clicking on "Access repository" below, you accept that your contact information (email address and username) can be shared with the dataset maintainers as well.
extra_gated_fields:
Email: text
I have read the License and agree with its terms: checkbox
dataset_info:
features:
- name: hexsha
dtype: string
- name: size
dtype: int64
- name: ext
dtype: string
- name: lang
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_repo_head_hexsha
dtype: string
- name: max_stars_repo_licenses
sequence: string
- name: max_stars_count
dtype: int64
- name: max_stars_repo_stars_event_min_datetime
dtype: string
- name: max_stars_repo_stars_event_max_datetime
dtype: string
- name: max_issues_repo_path
dtype: string
- name: max_issues_repo_name
dtype: string
- name: max_issues_repo_head_hexsha
dtype: string
- name: max_issues_repo_licenses
sequence: string
- name: max_issues_count
dtype: int64
- name: max_issues_repo_issues_event_min_datetime
dtype: string
- name: max_issues_repo_issues_event_max_datetime
dtype: string
- name: max_forks_repo_path
dtype: string
- name: max_forks_repo_name
dtype: string
- name: max_forks_repo_head_hexsha
dtype: string
- name: max_forks_repo_licenses
sequence: string
- name: max_forks_count
dtype: int64
- name: max_forks_repo_forks_event_min_datetime
dtype: string
- name: max_forks_repo_forks_event_max_datetime
dtype: string
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
splits:
- name: train
num_bytes: 2056665435.7311056
num_examples: 276520
download_size: 312473618
dataset_size: 2056665435.7311056
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for The Stack YAML K8s
This dataset is a subset of The Stack dataset data/yaml. The YAML files were
parsed and filtered out all valid K8s YAML files which is what this data is about.
The dataset contains 276520 valid K8s YAML files. The dataset was created by running
the [the-stack-yaml-k8s.ipynb](https://github.com/substratusai/the-stack-yaml-k8s/blob/main/the-stack-k8s-yaml.ipynb)
Notebook on K8s using [substratus.ai](https://substratus.ai)
Source code used to generate dataset: https://github.com/substratusai/the-stack-yaml-k8s
Need some help? Questions? Join our Discord server: <a href="https://discord.gg/JeXhcmjZVm"><img alt="discord-invite" src="https://dcbadge.vercel.app/api/server/JeXhcmjZVm?style=flat"></a>
### How to use it
```python
from datasets import load_dataset
ds = load_dataset("substratusai/the-stack-yaml-k8s", split="train")
ds[0]["content"]
```
## Original The Stack Dataset Description
- **Homepage:** https://www.bigcode-project.org/
- **Repository:** https://github.com/bigcode-project
- **Paper:** https://arxiv.org/abs/2211.15533
- **Leaderboard:** N/A
- **Point of Contact:** contact@bigcode-project.org
## Dataset Structure
### Data Instances
Each data instance corresponds to one file. The content of the file is in the `content` feature, and other features (`repository_name`, `licenses`, etc.) provide some metadata. Note that a given file can appear in several different repositories that satisfy our safe-license criterion. If that is the case, only the first (in alphabetical order) of these repositories is shown for simplicity.
### Data Fields
- `content` (string): the content of the file.
- `size` (integer): size of the uncompressed file.
- `lang` (string): the programming language.
- `ext` (string): file extension
- `avg_line_length` (float): the average line-length of the file.
- `max_line_length` (integer): the maximum line-length of the file.
- `alphanum_fraction` (float): the fraction of characters in the file that are alphabetical or numerical characters.
- `hexsha` (string): unique git hash of file
- `max_{stars|forks|issues}_repo_path` (string): path to file in repo containing this file with maximum number of `{stars|forks|issues}`
- `max_{stars|forks|issues}_repo_name` (string): name of repo containing this file with maximum number of `{stars|forks|issues}`
- `max_{stars|forks|issues}_repo_head_hexsha` (string): hexsha of repository head
- `max_{stars|forks|issues}_repo_licenses` (string): licenses in repository
- `max_{stars|forks|issues}_count` (integer): number of `{stars|forks|issues}` in repository
- `max_{stars|forks|issues}_repo_{stars|forks|issues}_min_datetime` (string): first timestamp of a `{stars|forks|issues}` event
- `max_{stars|forks|issues}_repo_{stars|forks|issues}_max_datetime` (string): last timestamp of a `{stars|forks|issues}` event
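These metadata fields can be used for sub-filtering; for instance, to keep only files from well-starred repositories (a sketch using the `max_stars_count` field documented above):
```python
from datasets import load_dataset

ds = load_dataset("substratusai/the-stack-yaml-k8s", split="train")

# `max_stars_count` may be None for some rows, hence the `or 0` guard.
popular = ds.filter(lambda x: (x["max_stars_count"] or 0) >= 10)
print(len(popular))
```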
### Data Splits
The dataset has no splits and all data is loaded as the train split by default. If you want to set up a custom train-test split, beware that the dataset contains a lot of near-duplicates, which can cause leakage into the test split.
## Dataset Creation
### Curation Rationale
One of the challenges faced by researchers working on code LLMs is the lack of openness and transparency around the development of these systems. Most prior works described the high-level data collection process but did not release the training data. It is therefore difficult for other researchers to fully reproduce these models and understand what kind of pre-training data leads to high-performing code LLMs. By releasing an open large-scale code dataset we hope to make training of code LLMs more reproducible.
### Source Data
#### Initial Data Collection and Normalization
220.92M active GitHub repository names were collected from the event archives published between January 1st, 2015 and March 31st, 2022 on [GHArchive](https://gharchive.org/). Only 137.36M of these repositories were public and accessible on GitHub – others were not accessible as they had been deleted by their owners. 51.76B files were downloaded from the public repositories on GitHub between November 2021 and June 2022. 5.28B files were unique. The uncompressed size of all stored files is 92.36TB.
The list of programming language extensions is taken from this [list](https://gist.github.com/ppisarczyk/43962d06686722d26d176fad46879d41) (also provided in Appendix C of the paper).
Near-deduplication was implemented in the pre-processing pipeline on top of exact deduplication. To find near-duplicates, MinHash with 256 permutations of all documents was computed in linear time. Locality Sensitive Hashing was used to find the clusters of duplicates. Jaccard similarities were then computed inside these clusters to remove any false positives, using a similarity threshold of 0.85. Roughly 40% of permissively licensed files were (near-)duplicates. See section 3 of the paper for further details; a simplified sketch of the approach is shown below.
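As an illustration only (not the project's actual implementation), the MinHash + LSH step can be sketched with the `datasketch` library, reusing the parameters quoted above; note that the real pipeline additionally verifies Jaccard similarity within each cluster:
```python
from datasketch import MinHash, MinHashLSH

def minhash(tokens, num_perm=256):
    m = MinHash(num_perm=num_perm)  # 256 permutations, as described above
    for tok in set(tokens):
        m.update(tok.encode("utf-8"))
    return m

# LSH index with the Jaccard similarity threshold quoted above
lsh = MinHashLSH(threshold=0.85, num_perm=256)

docs = {
    "a.yaml": "apiVersion v1 kind Pod metadata name demo spec containers image nginx".split(),
    "b.yaml": "apiVersion v1 kind Pod metadata name demo spec containers image nginx latest".split(),
    "c.yaml": "totally different tokens here".split(),
}

near_duplicates = set()
for key, tokens in docs.items():
    m = minhash(tokens)
    if lsh.query(m):      # candidate near-duplicate of an already indexed doc
        near_duplicates.add(key)
    else:
        lsh.insert(key, m)

print(near_duplicates)  # likely {'b.yaml'}
```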
The following are not stored:
- Files that cannot contribute to training code: binary, empty, could not be decoded
- Files larger than 1MB
- The excluded file extensions are listed in Appendix B of the paper.
##### License detection
Permissive licenses have minimal restrictions on how the software can be copied, modified, and redistributed. The full list of licenses can be found [here](https://huggingface.co/datasets/bigcode/the-stack-dedup/blob/main/licenses.json).
GHArchive contained the license information for approximately 12% of the collected repositories. For the remaining repositories, [go-license-detector](https://github.com/src-d/go-license-detector) was run to detect the most likely SPDX license identifier. The detector did not detect a license for ~81% of the repositories, in which case the repository was excluded from the dataset.
A file was included in the safe license dataset if at least one of the repositories containing the file had a permissive license.
#### Who are the source language producers?
The source (code) language producers are users of GitHub that created unique repository names between January 1st, 2015, and March 31st, 2022.
### Personal and Sensitive Information
The released dataset may contain sensitive information such as emails, IP addresses, and API/ssh keys that have previously been published to public repositories on GitHub. Deduplication has helped to reduce the amount of sensitive data that may exist. In the event that the dataset contains personal information, researchers should only use public, non-personal information in support of conducting and publishing their [open-access](https://en.wikipedia.org/wiki/Open_access) research. Personal information should not be used for spamming purposes, including sending unsolicited emails or selling of personal information. Complaints, removal requests, and "do not contact" requests can be sent to contact@bigcode-project.org.
The PII pipeline for this dataset is still a work in progress (see this [issue](https://github.com/bigcode-project/admin/issues/9) for updates). Researchers that wish to contribute to the anonymization pipeline of the project can apply to join [here](https://www.bigcode-project.org/docs/about/join/). Developers with source code in the dataset can request to have it removed [here](https://www.bigcode-project.org/docs/about/ip/) (proof of code contribution is required).
### Opting out of The Stack
We are giving developers the ability to have their code removed from the dataset upon request. The process for submitting and enacting removal requests will keep evolving throughout the project as we receive feedback and build up more data governance tools.
You can check if your code is in The Stack with the following ["Am I In The Stack?" Space](https://huggingface.co/spaces/bigcode/in-the-stack). If you'd like to have your data removed from the dataset follow the [instructions on GitHub](https://github.com/bigcode-project/opt-out-v2).
## Considerations for Using the Data
### Social Impact of Dataset
The Stack is an output of the BigCode Project. BigCode aims to be responsible by design and by default. The project is conducted in the spirit of Open Science, focused on the responsible development of LLMs for code.
With the release of The Stack, we aim to increase access, reproducibility, and transparency of code LLMs in the research community. Work to de-risk and improve on the implementation of ethical best practices of code LLMs is conducted in various BigCode working groups. The Legal, Ethics, and Governance working group has explored topics such as licensing (including copyleft and the intended use of permissively licensed code), attribution of generated code to original code, rights to restrict processing, the inclusion of Personally Identifiable Information (PII), and risks of malicious code, among other topics. This work is ongoing as of October 25th, 2022.
We expect code LLMs to enable people from diverse backgrounds to write higher quality code and develop low-code applications. Mission-critical software could become easier to maintain as professional developers are guided by code-generating systems on how to write more robust and efficient code. While the social impact is intended to be positive, the increased accessibility of code LLMs comes with certain risks such as over-reliance on the generated code and long-term effects on the software development job market.
A broader impact analysis relating to Code LLMs can be found in section 7 of this [paper](https://arxiv.org/abs/2107.03374). An in-depth risk assessment for Code LLMs can be found in section 4 of this [paper](https://arxiv.org/abs/2207.14157).
### Discussion of Biases
The code collected from GitHub does not contain demographic information or proxy information about the demographics. However, it is not without risks,
as the comments within the code may contain harmful or offensive language, which could be learned by the models.
Widely adopted programming languages like C and Javascript are overrepresented compared to niche programming languages like Julia and Scala. Some programming languages such as SQL, Batchfile, TypeScript are less likely to be permissively licensed (4% vs the average 10%). This may result in a biased representation of those languages. Permissively licensed files also tend to be longer.
Roughly 40 natural languages are present in docstrings and comments, with English being the most prevalent. In Python files, it makes up ~96% of the dataset.
For further information on data analysis of the Stack, see this [repo](https://github.com/bigcode-project/bigcode-analysis).
### Other Known Limitations
One of the current limitations of The Stack is that scraped HTML for websites may not be compliant with Web Content Accessibility Guidelines ([WCAG](https://www.w3.org/WAI/standards-guidelines/wcag/)). This could have an impact on HTML-generated code that may introduce web accessibility issues.
The training dataset could contain malicious code and/or the model could be used to generate malware or ransomware.
To the best of our knowledge, all files contained in the dataset are licensed with one of the permissive licenses (see list in [Licensing information](#licensing-information)). The accuracy of license attribution is limited by the accuracy of GHArchive and go-license-detector. Any mistakes should be reported to BigCode Project for review and follow-up as needed.
## Additional Information
### Dataset Curators
1. Harm de Vries, ServiceNow Research, harm.devries@servicenow.com
2. Leandro von Werra, Hugging Face, leandro@huggingface.co
### Licensing Information
The Stack is a collection of source code from repositories with various licenses. Any use of all or part of the code gathered in The Stack must abide by the terms of the original licenses, including attribution clauses when relevant. We facilitate this by providing provenance information for each data point.
The list of [SPDX license identifiers](https://spdx.org/licenses/) included in the dataset can be found [here](https://huggingface.co/datasets/bigcode/the-stack/blob/main/licenses.json).
### Citation Information
```
@article{Kocetkov2022TheStack,
title={The Stack: 3 TB of permissively licensed source code},
author={Kocetkov, Denis and Li, Raymond and Ben Allal, Loubna and Li, Jia and Mou,Chenghao and Muñoz Ferrandis, Carlos and Jernite, Yacine and Mitchell, Margaret and Hughes, Sean and Wolf, Thomas and Bahdanau, Dzmitry and von Werra, Leandro and de Vries, Harm},
journal={Preprint},
year={2022}
}
```
## Terms of Use for The Stack
The Stack dataset is a collection of source code in over 300 programming languages. We ask that you read and acknowledge the following points before using the dataset:
1. The Stack is a collection of source code from repositories with various licenses. Any use of all or part of the code gathered in The Stack must abide by the terms of the original licenses, including attribution clauses when relevant. We facilitate this by providing provenance information for each data point.
2. The Stack is regularly updated to enact validated data removal requests. By clicking on "Access repository", you agree to update your own version of The Stack to the most recent usable version specified by the maintainers in [the following thread](https://huggingface.co/datasets/bigcode/the-stack/discussions/7). If you have questions about dataset versions and allowed uses, please also ask them in the dataset’s [community discussions](https://huggingface.co/datasets/bigcode/the-stack/discussions/new). We will also notify users via email when the latest usable version changes.
3. To host, share, or otherwise provide access to The Stack dataset, you must include these Terms of Use and require users to agree to them.
|
W1lson/Book3 | ---
dataset_info:
features:
- name: Source ID
dtype: int64
- name: Primary Text
dtype: string
splits:
- name: train
num_bytes: 9831
num_examples: 87
download_size: 7549
dataset_size: 9831
---
# Dataset Card for "Book3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/code_instructions_standardized_cluster_11_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 23064508
num_examples: 7090
download_size: 12296169
dataset_size: 23064508
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_11_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/march_7th_starrail | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of march_7th/三月なのか/三月七/Mar. 7th (Honkai: Star Rail)
This is the dataset of march_7th/三月なのか/三月七/Mar. 7th (Honkai: Star Rail), containing 500 images and their tags.
The core tags of this character are `pink_hair, bangs, blue_eyes, breasts, hair_between_eyes, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.07 GiB | [Download](https://huggingface.co/datasets/CyberHarem/march_7th_starrail/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 492.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/march_7th_starrail/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1275 | 1.07 GiB | [Download](https://huggingface.co/datasets/CyberHarem/march_7th_starrail/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 898.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/march_7th_starrail/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1275 | 1.73 GiB | [Download](https://huggingface.co/datasets/CyberHarem/march_7th_starrail/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/march_7th_starrail',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, :d, looking_at_viewer, open_mouth, solo, white_shirt, long_sleeves, white_background, pink_eyes, simple_background, blue_jacket, choker, holding_camera, one_eye_closed, black_gloves, earrings |
| 1 | 17 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, solo, white_shirt, black_gloves, blue_jacket, open_mouth, :d, blue_skirt, earrings, single_glove, holding_camera, medium_hair, black_choker, multicolored_eyes, pink_eyes, partially_fingerless_gloves, teeth, one_eye_closed, pleated_skirt, sky |
| 2 | 6 |  |  |  |  |  | 1girl, black_footwear, blue_jacket, blue_skirt, full_body, long_sleeves, looking_at_viewer, open_mouth, solo, white_shirt, black_choker, shoes, :d, medium_hair, purple_eyes, holding_camera, one_eye_closed, pink_eyes, weapon |
| 3 | 45 |  |  |  |  |  | blush, 1girl, nipples, large_breasts, 1boy, hetero, pussy, navel, open_mouth, penis, sex, solo_focus, completely_nude, looking_at_viewer, vaginal, collarbone, mosaic_censoring, smile, sweat, heart-shaped_pupils, spread_legs |
| 4 | 6 |  |  |  |  |  | 1girl, bare_shoulders, blush, detached_sleeves, smile, tiara, looking_at_viewer, multicolored_hair, solo, white_dress, upper_body, blue_hair, closed_mouth, multicolored_eyes, pink_eyes, white_background |
| 5 | 5 |  |  |  |  |  | 1girl, bare_shoulders, detached_sleeves, dress, looking_at_viewer, solo, tiara, cleavage, long_sleeves, medium_breasts, open_mouth, white_thighhighs, :d, armpits, blue_hair, bow, blurry_background, blush, garter_straps, gradient_hair, medium_hair, nail_polish, pink_eyes, short_hair |
| 6 | 9 |  |  |  |  |  | navel, 1girl, looking_at_viewer, solo, blush, outdoors, blue_sky, choker, cleavage, cloud, collarbone, day, large_breasts, bare_shoulders, ocean, :d, beach, closed_mouth, holding, open_mouth, stomach, thigh_strap, white_bikini |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | :d | looking_at_viewer | open_mouth | solo | white_shirt | long_sleeves | white_background | pink_eyes | simple_background | blue_jacket | choker | holding_camera | one_eye_closed | black_gloves | earrings | blue_skirt | single_glove | medium_hair | black_choker | multicolored_eyes | partially_fingerless_gloves | teeth | pleated_skirt | sky | black_footwear | full_body | shoes | purple_eyes | weapon | blush | nipples | large_breasts | 1boy | hetero | pussy | navel | penis | sex | solo_focus | completely_nude | vaginal | collarbone | mosaic_censoring | smile | sweat | heart-shaped_pupils | spread_legs | bare_shoulders | detached_sleeves | tiara | multicolored_hair | white_dress | upper_body | blue_hair | closed_mouth | dress | cleavage | medium_breasts | white_thighhighs | armpits | bow | blurry_background | garter_straps | gradient_hair | nail_polish | short_hair | outdoors | blue_sky | cloud | day | ocean | beach | holding | stomach | thigh_strap | white_bikini |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----|:--------------------|:-------------|:-------|:--------------|:---------------|:-------------------|:------------|:--------------------|:--------------|:---------|:-----------------|:-----------------|:---------------|:-----------|:-------------|:---------------|:--------------|:---------------|:--------------------|:------------------------------|:--------|:----------------|:------|:-----------------|:------------|:--------|:--------------|:---------|:--------|:----------|:----------------|:-------|:---------|:--------|:--------|:--------|:------|:-------------|:------------------|:----------|:-------------|:-------------------|:--------|:--------|:----------------------|:--------------|:-----------------|:-------------------|:--------|:--------------------|:--------------|:-------------|:------------|:---------------|:--------|:-----------|:-----------------|:-------------------|:----------|:------|:--------------------|:----------------|:----------------|:--------------|:-------------|:-----------|:-----------|:--------|:------|:--------|:--------|:----------|:----------|:--------------|:---------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | X | | X | X | | | X | | X | X | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 45 |  |  |  |  |  | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | X | | X | | | X | X | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | X | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | | | | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | X | X | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | X | | | | X | | | | | | X | | | | | | X | | | | | | | X | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
sirfragles/mzpl | ---
license: unknown
---
|
VietnamAIHub/Vietnamese_Instruction_How_Step_by_Step | ---
license: creativeml-openrail-m
language:
- vi
size_categories:
- 10K<n<100K
--- |
lucassaicover/ALASTORBR | ---
license: openrail
---
|
open-llm-leaderboard/details_TheBloke__Llama-2-70B-fp16 | ---
pretty_name: Evaluation run of TheBloke/Llama-2-70B-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Llama-2-70B-fp16](https://huggingface.co/TheBloke/Llama-2-70B-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Llama-2-70B-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T03:18:37.286787](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-70B-fp16/blob/main/results_2023-10-23T03-18-37.286787.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0017827181208053692,\n\
\ \"em_stderr\": 0.00043200973460388544,\n \"f1\": 0.06615562080536916,\n\
\ \"f1_stderr\": 0.0013739852117668813,\n \"acc\": 0.5885312292623206,\n\
\ \"acc_stderr\": 0.011707750309504293\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.00043200973460388544,\n\
\ \"f1\": 0.06615562080536916,\n \"f1_stderr\": 0.0013739852117668813\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.33965125094768767,\n \
\ \"acc_stderr\": 0.01304504506766526\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.010370455551343326\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Llama-2-70B-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|arc:challenge|25_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T03_18_37.286787
path:
- '**/details_harness|drop|3_2023-10-23T03-18-37.286787.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T03-18-37.286787.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T03_18_37.286787
path:
- '**/details_harness|gsm8k|5_2023-10-23T03-18-37.286787.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T03-18-37.286787.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hellaswag|10_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T16:40:00.231770.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T16:40:00.231770.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T16:40:00.231770.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T03_18_37.286787
path:
- '**/details_harness|winogrande|5_2023-10-23T03-18-37.286787.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T03-18-37.286787.parquet'
- config_name: results
data_files:
- split: 2023_07_31T16_40_00.231770
path:
- results_2023-07-31T16:40:00.231770.parquet
- split: 2023_10_23T03_18_37.286787
path:
- results_2023-10-23T03-18-37.286787.parquet
- split: latest
path:
- results_2023-10-23T03-18-37.286787.parquet
---
# Dataset Card for Evaluation run of TheBloke/Llama-2-70B-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Llama-2-70B-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Llama-2-70B-fp16](https://huggingface.co/TheBloke/Llama-2-70B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-70B-fp16",
"harness_winogrande_5",
split="train")
```
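The same pattern works for the other configurations and splits listed in the metadata above, for example the aggregated "results" config or a specific timestamped run (an illustrative sketch reusing names from this card):
```python
from datasets import load_dataset

# Aggregated metrics across tasks (the "results" config declared above).
results = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-70B-fp16",
                       "results",
                       split="latest")

# A single timestamped run of one task.
run = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-70B-fp16",
                   "harness_winogrande_5",
                   split="2023_10_23T03_18_37.286787")
```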
## Latest results
These are the [latest results from run 2023-10-23T03:18:37.286787](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-70B-fp16/blob/main/results_2023-10-23T03-18-37.286787.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0017827181208053692,
"em_stderr": 0.00043200973460388544,
"f1": 0.06615562080536916,
"f1_stderr": 0.0013739852117668813,
"acc": 0.5885312292623206,
"acc_stderr": 0.011707750309504293
},
"harness|drop|3": {
"em": 0.0017827181208053692,
"em_stderr": 0.00043200973460388544,
"f1": 0.06615562080536916,
"f1_stderr": 0.0013739852117668813
},
"harness|gsm8k|5": {
"acc": 0.33965125094768767,
"acc_stderr": 0.01304504506766526
},
"harness|winogrande|5": {
"acc": 0.8374112075769534,
"acc_stderr": 0.010370455551343326
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Tuzu/vozgabrielgava | ---
license: openrail
---
|
huaibovip/IXI_dataset_for_registration | ---
license: cc-by-sa-3.0
tags:
- medical
--- |
lzy337/attack_data_hf | ---
configs:
- config_name: default
data_files:
- split: train
path:
- toxicity/toxic.jsonl.gpt3.n=25.out1.split.annotated.jsonl.filtered_train.jsonl
- split: test
path:
- toxicity/toxic.jsonl.gpt3.n=25.out1.split.annotated.jsonl.filtered_test.jsonl
- split: dev
path:
- toxicity/toxic.jsonl.gpt3.n=25.out1.split.annotated.jsonl.filtered_dev.jsonl
---
Toxicity contains three types of data: 1. prompts from RealToxicityPrompts; 2. responses from GPT-3.5 generation used as prompts; 3. same as 2, but generated by GPT-4.
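A minimal loading sketch (illustrative; the repo id and split names are taken from the config above):
```python
from datasets import load_dataset

# Each split maps to one of the filtered JSONL files declared in the config.
train = load_dataset("lzy337/attack_data_hf", split="train")
dev = load_dataset("lzy337/attack_data_hf", split="dev")
```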
|
Sober-Clever/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: due_on
dtype: 'null'
- name: closed_at
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 1420461
num_examples: 100
download_size: 513444
dataset_size: 1420461
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gsstein/100-percent-human-dataset-opt | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 86099199
num_examples: 15326
- name: test
num_bytes: 3058678
num_examples: 576
- name: validation
num_bytes: 3255787
num_examples: 576
download_size: 57143897
dataset_size: 92413664
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
tyzhu/squad_title_v4_train_10_eval_10 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 203084
num_examples: 138
- name: validation
num_bytes: 50807
num_examples: 50
download_size: 65145
dataset_size: 253891
---
# Dataset Card for "squad_title_v4_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bbz662bbz/databricks-dolly-15k-ja-gozarinnemon | ---
license: cc-by-sa-3.0
---
This dataset was created using "kunishou/databricks-dolly-15k-ja".
This dataset is licensed under CC BY-SA 3.0.
Last update: 2023-05-28
databricks-dolly-15k-ja-gozarinnemon
kunishou/databricks-dolly-15k-ja
https://huggingface.co/datasets/kunishou/databricks-dolly-15k-ja
|
mteb/neuclir-2023 | ---
language:
- fas
- rus
- zho
multilinguality:
- multilingual
task_categories:
- text-retrieval
---
From the NeuCLIR TREC Track 2023: https://arxiv.org/abs/2304.12367
Generated from https://huggingface.co/datasets/neuclir/neuclir1
```
@article{lawrie2024overview,
title={Overview of the TREC 2023 NeuCLIR Track},
author={Lawrie, Dawn and MacAvaney, Sean and Mayfield, James and McNamee, Paul and Oard, Douglas W and Soldaini, Luca and Yang, Eugene},
url={https://trec.nist.gov/pubs/trec32/papers/Overview_neuclir.pdf},
year={2024}
}
```
|
Falah/fantasy_animal_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2645706
num_examples: 10000
download_size: 335130
dataset_size: 2645706
---
# Dataset Card for "fantasy_animal_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/mitsuki_sonoda_sakuratrick | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Mitsuki Sonoda
This is the dataset of Mitsuki Sonoda, containing 132 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 132 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 348 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 417 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 132 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 132 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 132 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 348 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 348 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 295 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 417 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 417 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
roa7n/patched_1000_test_p_150_m2_embeddings | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence_str
dtype: string
- name: label
dtype: int64
- name: features
sequence: float64
splits:
- name: train
num_bytes: 9275601628
num_examples: 1035692
download_size: 8812286870
dataset_size: 9275601628
---
# Dataset Card for "patched_1000_test_p_150_m2_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Birchlabs/openai-prm800k-stepwise-critic | ---
license: mit
---
|
CVasNLPExperiments/FGVC_Aircraft_test_google_flan_t5_xxl_mode_A_ns_200 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 84203
num_examples: 200
download_size: 17928
dataset_size: 84203
---
# Dataset Card for "FGVC_Aircraft_test_google_flan_t5_xxl_mode_A_ns_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gowitheflowlab/parallel-9 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 649296909.6590501
num_examples: 3322980
download_size: 428488796
dataset_size: 649296909.6590501
---
# Dataset Card for "parallel-9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Snoopy04/mmlu-sv-500 | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: choices
sequence: string
splits:
- name: train
num_bytes: 2128.411306042885
num_examples: 5
- name: test
num_bytes: 216246.5886939571
num_examples: 508
download_size: 140225
dataset_size: 218375.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
financial_phrasebank | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license:
- cc-by-nc-sa-3.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
- sentiment-classification
pretty_name: FinancialPhrasebank
dataset_info:
- config_name: sentences_allagree
features:
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
splits:
- name: train
num_bytes: 303371
num_examples: 2264
download_size: 681890
dataset_size: 303371
- config_name: sentences_75agree
features:
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
splits:
- name: train
num_bytes: 472703
num_examples: 3453
download_size: 681890
dataset_size: 472703
- config_name: sentences_66agree
features:
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
splits:
- name: train
num_bytes: 587152
num_examples: 4217
download_size: 681890
dataset_size: 587152
- config_name: sentences_50agree
features:
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
splits:
- name: train
num_bytes: 679240
num_examples: 4846
download_size: 681890
dataset_size: 679240
tags:
- finance
---
# Dataset Card for financial_phrasebank
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Kaggle](https://www.kaggle.com/ankurzing/sentiment-analysis-for-financial-news) [ResearchGate](https://www.researchgate.net/publication/251231364_FinancialPhraseBank-v10)
- **Repository:**
- **Paper:** [Arxiv](https://arxiv.org/abs/1307.5336)
- **Leaderboard:** [Kaggle](https://www.kaggle.com/ankurzing/sentiment-analysis-for-financial-news/code) [PapersWithCode](https://paperswithcode.com/sota/sentiment-analysis-on-financial-phrasebank)
- **Point of Contact:** [Pekka Malo](mailto:pekka.malo@aalto.fi) [Ankur Sinha](mailto:ankur.sinha@aalto.fi)
### Dataset Summary
Polar sentiment dataset of sentences from financial news. The dataset consists of 4840 sentences from English language financial news categorised by sentiment. The dataset is divided by agreement rate of 5-8 annotators.
### Supported Tasks and Leaderboards
Sentiment Classification
### Languages
English
## Dataset Structure
### Data Instances
```
{ "sentence": "Pharmaceuticals group Orion Corp reported a fall in its third-quarter earnings that were hit by larger expenditures on R&D and marketing .",
"label": "negative"
}
```
### Data Fields
- sentence: a tokenized line from the dataset
- label: a label corresponding to the class as a string: 'positive', 'negative' or 'neutral'
### Data Splits
There's no train/validation/test split.
However the dataset is available in four possible configurations depending on the percentage of agreement of annotators:
`sentences_50agree`: Number of instances with >=50% annotator agreement: 4846
`sentences_66agree`: Number of instances with >=66% annotator agreement: 4217
`sentences_75agree`: Number of instances with >=75% annotator agreement: 3453
`sentences_allagree`: Number of instances with 100% annotator agreement: 2264
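Each configuration can be loaded by name with the `datasets` library (a minimal sketch; as noted above, only a `train` split is provided):
```python
from datasets import load_dataset

# Keep only the sentences on which all annotators agreed.
phrasebank = load_dataset("financial_phrasebank", "sentences_allagree")
print(phrasebank["train"][0])  # a dict with 'sentence' and 'label' fields
```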
## Dataset Creation
### Curation Rationale
The key arguments for the low utilization of statistical techniques in
financial sentiment analysis have been the difficulty of implementation for
practical applications and the lack of high quality training data for building
such models. Especially in the case of finance and economic texts, annotated
collections are a scarce resource and many are reserved for proprietary use
only. To resolve the missing training data problem, we present a collection of
∼ 5000 sentences to establish human-annotated standards for benchmarking
alternative modeling techniques.
The objective of the phrase level annotation task was to classify each example
sentence into a positive, negative or neutral category by considering only the
information explicitly available in the given sentence. Since the study is
focused only on financial and economic domains, the annotators were asked to
consider the sentences from the view point of an investor only; i.e. whether
the news may have positive, negative or neutral influence on the stock price.
As a result, sentences which have a sentiment that is not relevant from an
economic or financial perspective are considered neutral.
### Source Data
#### Initial Data Collection and Normalization
The corpus used in this paper is made out of English news on all listed
companies in OMX Helsinki. The news has been downloaded from the LexisNexis
database using an automated web scraper. Out of this news database, a random
subset of 10,000 articles was selected to obtain good coverage across small and
large companies, companies in different industries, as well as different news
sources. Following the approach taken by Maks and Vossen (2010), we excluded
all sentences which did not contain any of the lexicon entities. This reduced
the overall sample to 53,400 sentences, where each has at least one or more
recognized lexicon entity. The sentences were then classified according to the
types of entity sequences detected. Finally, a random sample of ∼5000 sentences
was chosen to represent the overall news database.
#### Who are the source language producers?
The source data was written by various financial journalists.
### Annotations
#### Annotation process
This release of the financial phrase bank covers a collection of 4840
sentences. The selected collection of phrases was annotated by 16 people with
adequate background knowledge on financial markets.
Given the large number of overlapping annotations (5 to 8 annotations per
sentence), there are several ways to define a majority vote based gold
standard. To provide an objective comparison, we have formed 4 alternative
reference datasets based on the strength of majority agreement: `sentences_50agree`, `sentences_66agree`, `sentences_75agree`, and `sentences_allagree` (see Data Splits above).
#### Who are the annotators?
Three of the annotators were researchers and the remaining 13 annotators were
master's students at Aalto University School of Business with majors primarily
in finance, accounting, and economics.
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
All annotators were from the same institution and so interannotator agreement
should be understood with this taken into account.
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/3.0/.
If you are interested in commercial use of the data, please contact the following authors for an appropriate license:
- [Pekka Malo](mailto:pekka.malo@aalto.fi)
- [Ankur Sinha](mailto:ankur.sinha@aalto.fi)
### Citation Information
```
@article{Malo2014GoodDO,
title={Good debt or bad debt: Detecting semantic orientations in economic texts},
author={P. Malo and A. Sinha and P. Korhonen and J. Wallenius and P. Takala},
journal={Journal of the Association for Information Science and Technology},
year={2014},
volume={65}
}
```
### Contributions
Thanks to [@frankier](https://github.com/frankier) for adding this dataset. |
WENGSYX/LMTuner-medical-v1 | ---
dataset_info:
features:
- name: conversations
sequence: string
- name: source
dtype: string
- name: version
dtype: string
splits:
- name: train
num_bytes: 163208205
num_examples: 65850
download_size: 103626649
dataset_size: 163208205
---
# Dataset Card for "Lingo-medical-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_DangFutures__BIG_DANG_BOT | ---
pretty_name: Evaluation run of DangFutures/BIG_DANG_BOT
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DangFutures/BIG_DANG_BOT](https://huggingface.co/DangFutures/BIG_DANG_BOT) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DangFutures__BIG_DANG_BOT\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-24T10:23:33.414372](https://huggingface.co/datasets/open-llm-leaderboard/details_DangFutures__BIG_DANG_BOT/blob/main/results_2024-01-24T10-23-33.414372.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6873010049412217,\n\
\ \"acc_stderr\": 0.03039019909881743,\n \"acc_norm\": 0.700585054900533,\n\
\ \"acc_norm_stderr\": 0.03121364644444388,\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.4907419803847836,\n\
\ \"mc2_stderr\": 0.014683278149160121\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.01449442158425652,\n\
\ \"acc_norm\": 0.6032423208191127,\n \"acc_norm_stderr\": 0.014296513020180635\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6421031666998606,\n\
\ \"acc_stderr\": 0.0047840184976798185,\n \"acc_norm\": 0.8201553475403306,\n\
\ \"acc_norm_stderr\": 0.00383273101759212\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6962962962962963,\n\
\ \"acc_stderr\": 0.03972552884785136,\n \"acc_norm\": 0.6962962962962963,\n\
\ \"acc_norm_stderr\": 0.03972552884785136\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.03459777606810536,\n\
\ \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.03459777606810536\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\
\ \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7849056603773585,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.7849056603773585,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.028919802956134916,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.028919802956134916\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\
\ \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.6994219653179191,\n\
\ \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6425531914893617,\n \"acc_stderr\": 0.031329417894764254,\n\
\ \"acc_norm\": 0.6425531914893617,\n \"acc_norm_stderr\": 0.031329417894764254\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6228070175438597,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.6228070175438597,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.03921545312467122,\n\
\ \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.03921545312467122\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.46825396825396826,\n \"acc_stderr\": 0.025699352832131796,\n \"\
acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.025699352832131796\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.832258064516129,\n \"acc_stderr\": 0.021255464065371318,\n \"\
acc_norm\": 0.832258064516129,\n \"acc_norm_stderr\": 0.021255464065371318\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5714285714285714,\n \"acc_stderr\": 0.03481904844438804,\n \"\
acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.03481904844438804\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721164,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721164\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6948717948717948,\n \"acc_stderr\": 0.023346335293325887,\n\
\ \"acc_norm\": 0.6948717948717948,\n \"acc_norm_stderr\": 0.023346335293325887\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8025210084033614,\n \"acc_stderr\": 0.025859164122051453,\n\
\ \"acc_norm\": 0.8025210084033614,\n \"acc_norm_stderr\": 0.025859164122051453\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"\
acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8954128440366973,\n \"acc_stderr\": 0.013120530245265594,\n \"\
acc_norm\": 0.8954128440366973,\n \"acc_norm_stderr\": 0.013120530245265594\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5879629629629629,\n \"acc_stderr\": 0.03356787758160831,\n \"\
acc_norm\": 0.5879629629629629,\n \"acc_norm_stderr\": 0.03356787758160831\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878467,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878467\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.027790177064383595,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.027790177064383595\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n\
\ \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.6339285714285714,\n\
\ \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n\
\ \"acc_stderr\": 0.017893784904018533,\n \"acc_norm\": 0.9188034188034188,\n\
\ \"acc_norm_stderr\": 0.017893784904018533\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8633461047254151,\n\
\ \"acc_stderr\": 0.012282876868629234,\n \"acc_norm\": 0.8633461047254151,\n\
\ \"acc_norm_stderr\": 0.012282876868629234\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7658959537572254,\n \"acc_stderr\": 0.022797110278071134,\n\
\ \"acc_norm\": 0.7658959537572254,\n \"acc_norm_stderr\": 0.022797110278071134\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4022346368715084,\n\
\ \"acc_stderr\": 0.016399716732847142,\n \"acc_norm\": 0.4022346368715084,\n\
\ \"acc_norm_stderr\": 0.016399716732847142\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.023929155517351305,\n\
\ \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.023929155517351305\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7877813504823151,\n\
\ \"acc_stderr\": 0.023222756797435098,\n \"acc_norm\": 0.7877813504823151,\n\
\ \"acc_norm_stderr\": 0.023222756797435098\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.022021366100220197,\n\
\ \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.022021366100220197\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.549645390070922,\n \"acc_stderr\": 0.029680105565029043,\n \
\ \"acc_norm\": 0.549645390070922,\n \"acc_norm_stderr\": 0.029680105565029043\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.529986962190352,\n\
\ \"acc_stderr\": 0.012747248967079036,\n \"acc_norm\": 0.529986962190352,\n\
\ \"acc_norm_stderr\": 0.012747248967079036\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7720588235294118,\n \"acc_stderr\": 0.0254830814680298,\n\
\ \"acc_norm\": 0.7720588235294118,\n \"acc_norm_stderr\": 0.0254830814680298\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7761437908496732,\n \"acc_stderr\": 0.016863008585416613,\n \
\ \"acc_norm\": 0.7761437908496732,\n \"acc_norm_stderr\": 0.016863008585416613\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.026358916334904028,\n\
\ \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.026358916334904028\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015575,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015575\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.4907419803847836,\n\
\ \"mc2_stderr\": 0.014683278149160121\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510423\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/DangFutures/BIG_DANG_BOT
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|arc:challenge|25_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|gsm8k|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hellaswag|10_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T10-23-33.414372.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T10-23-33.414372.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- '**/details_harness|winogrande|5_2024-01-24T10-23-33.414372.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-24T10-23-33.414372.parquet'
- config_name: results
data_files:
- split: 2024_01_24T10_23_33.414372
path:
- results_2024-01-24T10-23-33.414372.parquet
- split: latest
path:
- results_2024-01-24T10-23-33.414372.parquet
---
# Dataset Card for Evaluation run of DangFutures/BIG_DANG_BOT
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DangFutures/BIG_DANG_BOT](https://huggingface.co/DangFutures/BIG_DANG_BOT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DangFutures__BIG_DANG_BOT",
	"harness_winogrande_5",
	split="latest")  # splits are "latest" or the run timestamp; no "train" split is defined
```
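Similarly, the aggregated metrics can be loaded from the "results" configuration. This is a minimal sketch under the split names declared above; the exact row schema is an assumption (it should mirror the aggregated JSON shown in the next section):
```python
from datasets import load_dataset

# "results" holds the aggregated metrics of the run; "latest" points to the newest run
results = load_dataset("open-llm-leaderboard/details_DangFutures__BIG_DANG_BOT",
                       "results",
                       split="latest")
print(results[0])  # assumption: one row per run containing the aggregated scores
```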
## Latest results
These are the [latest results from run 2024-01-24T10:23:33.414372](https://huggingface.co/datasets/open-llm-leaderboard/details_DangFutures__BIG_DANG_BOT/blob/main/results_2024-01-24T10-23-33.414372.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6873010049412217,
"acc_stderr": 0.03039019909881743,
"acc_norm": 0.700585054900533,
"acc_norm_stderr": 0.03121364644444388,
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907825,
"mc2": 0.4907419803847836,
"mc2_stderr": 0.014683278149160121
},
"harness|arc:challenge|25": {
"acc": 0.5631399317406144,
"acc_stderr": 0.01449442158425652,
"acc_norm": 0.6032423208191127,
"acc_norm_stderr": 0.014296513020180635
},
"harness|hellaswag|10": {
"acc": 0.6421031666998606,
"acc_stderr": 0.0047840184976798185,
"acc_norm": 0.8201553475403306,
"acc_norm_stderr": 0.00383273101759212
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6962962962962963,
"acc_stderr": 0.03972552884785136,
"acc_norm": 0.6962962962962963,
"acc_norm_stderr": 0.03972552884785136
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.03459777606810536,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.03459777606810536
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7849056603773585,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.7849056603773585,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.028919802956134916,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.028919802956134916
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0349610148119118,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0349610148119118
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6425531914893617,
"acc_stderr": 0.031329417894764254,
"acc_norm": 0.6425531914893617,
"acc_norm_stderr": 0.031329417894764254
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6228070175438597,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.6228070175438597,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6689655172413793,
"acc_stderr": 0.03921545312467122,
"acc_norm": 0.6689655172413793,
"acc_norm_stderr": 0.03921545312467122
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.025699352832131796,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.025699352832131796
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.832258064516129,
"acc_stderr": 0.021255464065371318,
"acc_norm": 0.832258064516129,
"acc_norm_stderr": 0.021255464065371318
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.03481904844438804,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.03481904844438804
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721164,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721164
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6948717948717948,
"acc_stderr": 0.023346335293325887,
"acc_norm": 0.6948717948717948,
"acc_norm_stderr": 0.023346335293325887
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8025210084033614,
"acc_stderr": 0.025859164122051453,
"acc_norm": 0.8025210084033614,
"acc_norm_stderr": 0.025859164122051453
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8954128440366973,
"acc_stderr": 0.013120530245265594,
"acc_norm": 0.8954128440366973,
"acc_norm_stderr": 0.013120530245265594
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5879629629629629,
"acc_stderr": 0.03356787758160831,
"acc_norm": 0.5879629629629629,
"acc_norm_stderr": 0.03356787758160831
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878467,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878467
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383595,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.04572372358737431,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.04572372358737431
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.017893784904018533,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.017893784904018533
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8633461047254151,
"acc_stderr": 0.012282876868629234,
"acc_norm": 0.8633461047254151,
"acc_norm_stderr": 0.012282876868629234
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7658959537572254,
"acc_stderr": 0.022797110278071134,
"acc_norm": 0.7658959537572254,
"acc_norm_stderr": 0.022797110278071134
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4022346368715084,
"acc_stderr": 0.016399716732847142,
"acc_norm": 0.4022346368715084,
"acc_norm_stderr": 0.016399716732847142
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.023929155517351305,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.023929155517351305
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7877813504823151,
"acc_stderr": 0.023222756797435098,
"acc_norm": 0.7877813504823151,
"acc_norm_stderr": 0.023222756797435098
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.022021366100220197,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.022021366100220197
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.549645390070922,
"acc_stderr": 0.029680105565029043,
"acc_norm": 0.549645390070922,
"acc_norm_stderr": 0.029680105565029043
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.529986962190352,
"acc_stderr": 0.012747248967079036,
"acc_norm": 0.529986962190352,
"acc_norm_stderr": 0.012747248967079036
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7720588235294118,
"acc_stderr": 0.0254830814680298,
"acc_norm": 0.7720588235294118,
"acc_norm_stderr": 0.0254830814680298
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7761437908496732,
"acc_stderr": 0.016863008585416613,
"acc_norm": 0.7761437908496732,
"acc_norm_stderr": 0.016863008585416613
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.026358916334904028,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.026358916334904028
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015575,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015575
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907825,
"mc2": 0.4907419803847836,
"mc2_stderr": 0.014683278149160121
},
"harness|winogrande|5": {
"acc": 0.8089976322020521,
"acc_stderr": 0.011047808761510423
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
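To work with these numbers programmatically, you could download the linked JSON file and aggregate it yourself. The sketch below assumes the file contains the dict shown above (the real file may nest it under a "results" key, which is handled defensively):
```python
import json
from statistics import mean

with open("results_2024-01-24T10-23-33.414372.json", "r", encoding="utf-8") as f:
    data = json.load(f)

# the per-task dict may be the whole file or nested under a "results" key
results = data.get("results", data)

# average the accuracies of the MMLU (hendrycksTest) tasks
mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_accs)} MMLU tasks, mean acc = {mean(mmlu_accs):.4f}")
```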
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/jingliu_starrail | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of jingliu/鏡流/镜流/경류 (Honkai: Star Rail)
This is the dataset of jingliu/鏡流/镜流/경류 (Honkai: Star Rail), containing 500 images and their tags.
The core tags of this character are `long_hair, bangs, breasts, red_eyes, white_hair, hair_between_eyes, very_long_hair, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.20 GiB | [Download](https://huggingface.co/datasets/CyberHarem/jingliu_starrail/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 500      | 544.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jingliu_starrail/resolve/main/dataset-800.zip)                | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800  | 1309     | 1.15 GiB   | [Download](https://huggingface.co/datasets/CyberHarem/jingliu_starrail/resolve/main/dataset-stage3-p480-800.zip)   | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |
| 1200             | 500      | 995.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jingliu_starrail/resolve/main/dataset-1200.zip)              | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 1309 | 1.83 GiB | [Download](https://huggingface.co/datasets/CyberHarem/jingliu_starrail/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
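The `IMG+TXT` packages pair each image with a same-named plain-text tag file. As a minimal sketch (the `.png`/`.txt` pairing and the flat archive layout are assumptions, not guaranteed by the table above), the `800` package can be fetched and read like this:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/jingliu_starrail',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# assumption: each image sits next to a same-named .txt file holding its tags
for fname in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(fname)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        tag_file = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(tag_file):
            with open(tag_file, 'r', encoding='utf-8') as f:
                print(fname, '->', f.read().strip())
```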
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/jingliu_starrail',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
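The IMG+TXT packages can also be used without waifuc. Below is a minimal, hedged sketch for the 800px package, assuming the common IMG+TXT layout in which every image ships with a same-named `.txt` file holding its comma-separated tags (the exact archive layout may differ):
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/jingliu_starrail',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair every image with its same-named .txt tag file, if present
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    txt_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(txt_path):
        with open(txt_path, encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```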
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, bare_shoulders, closed_mouth, looking_at_viewer, solo, black_gloves, detached_sleeves, black_dress, cleavage, ponytail |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, earrings, looking_at_viewer, ponytail, solo, upper_body, closed_mouth, dress, cleavage, detached_sleeves, small_breasts, artist_name, hair_ribbon, medium_breasts |
| 2 | 14 |  |  |  |  |  | 1girl, solo, holding_sword, looking_at_viewer, bare_shoulders, black_gloves, full_moon, night, blue_dress, cleavage, medium_breasts, closed_mouth, grey_hair, ribbon, parted_lips, sky |
| 3 | 5 |  |  |  |  |  | 1girl, bare_shoulders, dress, looking_at_viewer, solo, black_gloves, boots, holding_sword, armor, closed_mouth, parted_lips |
| 4 | 8 |  |  |  |  |  | 1girl, bare_shoulders, solo, blue_dress, looking_at_viewer, medium_breasts, black_gloves, parted_lips, cleavage, bare_legs, barefoot, detached_sleeves, elbow_gloves, feet, full_body, grey_hair, sitting, toes, hair_over_one_eye, jewelry, moon |
| 5 | 5 |  |  |  |  |  | 1girl, black_footwear, knee_boots, looking_at_viewer, solo, bare_shoulders, black_gloves, full_body, high_heel_boots, medium_breasts, sitting, blue_dress, closed_mouth, detached_sleeves, hair_ribbon, simple_background, thighs, white_background, grey_hair, hair_over_one_eye, knee_up, large_breasts, white_skirt |
| 6 | 10 |  |  |  |  |  | 1girl, blush, completely_nude, navel, nipples, solo, closed_mouth, collarbone, large_breasts, blue_hair, hair_ribbon, looking_at_viewer, simple_background, white_background, medium_breasts, armpits, blue_ribbon, pussy |
| 7 | 5 |  |  |  |  |  | 1boy, 1girl, cowgirl_position, hetero, mosaic_censoring, navel, penis, pussy, solo_focus, blush, girl_on_top, large_breasts, nipples, pov, sex, vaginal, grey_hair, open_mouth, bare_shoulders, blindfold, completely_nude, cum, detached_sleeves, earrings, looking_at_viewer, night_sky, ribbon, smile, star_(sky), sweat, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | closed_mouth | looking_at_viewer | solo | black_gloves | detached_sleeves | black_dress | cleavage | ponytail | earrings | upper_body | dress | small_breasts | artist_name | hair_ribbon | medium_breasts | holding_sword | full_moon | night | blue_dress | grey_hair | ribbon | parted_lips | sky | boots | armor | bare_legs | barefoot | elbow_gloves | feet | full_body | sitting | toes | hair_over_one_eye | jewelry | moon | black_footwear | knee_boots | high_heel_boots | simple_background | thighs | white_background | knee_up | large_breasts | white_skirt | blush | completely_nude | navel | nipples | collarbone | blue_hair | armpits | blue_ribbon | pussy | 1boy | cowgirl_position | hetero | mosaic_censoring | penis | solo_focus | girl_on_top | pov | sex | vaginal | open_mouth | blindfold | cum | night_sky | smile | star_(sky) | sweat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:--------------------|:-------|:---------------|:-------------------|:--------------|:-----------|:-----------|:-----------|:-------------|:--------|:----------------|:--------------|:--------------|:-----------------|:----------------|:------------|:--------|:-------------|:------------|:---------|:--------------|:------|:--------|:--------|:------------|:-----------|:---------------|:-------|:------------|:----------|:-------|:--------------------|:----------|:-------|:-----------------|:-------------|:------------------|:--------------------|:---------|:-------------------|:----------|:----------------|:--------------|:--------|:------------------|:--------|:----------|:-------------|:------------|:----------|:--------------|:--------|:-------|:-------------------|:---------|:-------------------|:--------|:-------------|:--------------|:------|:------|:----------|:-------------|:------------|:------|:------------|:--------|:-------------|:--------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | X | X | X | X | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | X | | | | | X | | | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | | X | X | X | X | | X | | | | | | | | X | | | | X | X | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | X | X | | | | X | X | | | | | | | | | | X | X | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | | X | X | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | | X | | | X | | | | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | X | | | X | | X | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
jjzha/fijo | ---
license: cc-by-nc-sa-4.0
language: fr
---
This is the skill dataset created by:
```
@article{beauchemin-2022-fijo,
author = {Beauchemin, David and Laumonier, Julien and Ster, Yvan Le and Yassine, Marouane},
journal = {Proceedings of the Canadian Conference on Artificial Intelligence},
year = {2022},
month = {may 27},
note = {https://caiac.pubpub.org/pub/72bhunl6},
publisher = {Canadian Artificial Intelligence Association (CAIAC)},
title = {``{FIJO}'': a {French} {Insurance} {Soft} {Skill} {Detection} {Dataset}},
}
```
There are no document delimiters.
Number of samples (sentences):
- train: 399
- dev: 49
- test: 49
Sources:
- This dataset was collected as part of the multidisciplinary project Femmes face aux défis de la transformation numérique : une étude de cas dans le secteur des assurances (Women Facing the Challenges of Digital Transformation: A Case Study in the Insurance Sector) at Université Laval, funded by the Future Skills Centre. It includes job offers, in French, from insurance companies between 2009 and 2020.
Type of tags:
- BIO tags in `tags_skill` with fine-grained labels (see the decoding sketch after this list):
- PENSEE: thoughts
- RESULTATS: results
- RELATIONNEL: relational
- PERSONNEL: personal
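As a reference for consumers of these tags, here is a minimal sketch in plain Python that decodes a BIO sequence into labelled skill spans (`bio_spans` is an illustrative helper, not part of the dataset; the tokens and tags are truncated from the sample shown below):
```python
def bio_spans(tokens, tags):
    """Collect (label, phrase) spans from a BIO-tagged token sequence."""
    spans = []
    label, buf = None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):          # a new span starts; close any open one
            if label is not None:
                spans.append((label, " ".join(buf)))
            label, buf = tag[2:], [token]
        elif tag.startswith("I-") and label == tag[2:]:
            buf.append(token)             # continuation of the current span
        else:                             # "O" (or a stray I-) closes the span
            if label is not None:
                spans.append((label, " ".join(buf)))
            label, buf = None, []
    if label is not None:                 # flush a span that runs to the end
        spans.append((label, " ".join(buf)))
    return spans

tokens = ["-", "Sens", "de", "l'analyse", "écoute", "et", "minutie"]
tags = ["O", "B-PENSEE", "I-PENSEE", "I-PENSEE", "B-RELATIONNEL", "O", "B-PERSONNEL"]
print(bio_spans(tokens, tags))
# [('PENSEE', "Sens de l'analyse"), ('RELATIONNEL', 'écoute'), ('PERSONNEL', 'minutie')]
```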
Sample:
```
{
"idx": 47, "tokens": ["-", "Sens", "de", "l\u2019analyse", "\u00e9coute", "et", "minutie", "de", "transcription", "des", "informations", "-", "Professionnalisme", "vu", "le", "recueillement", "d'informations", "souvent", "d\u00e9licates."],
"tags_skill": ["O", "B-PENSEE", "I-PENSEE", "I-PENSEE", "B-RELATIONNEL", "O", "B-PERSONNEL", "I-PERSONNEL", "I-PERSONNEL", "I-PERSONNEL", "I-PERSONNEL", "O", "B-PERSONNEL", "O", "O", "B-RELATIONNEL", "I-RELATIONNEL", "I-RELATIONNEL", "I-RELATIONNEL"]
}
``` |
YBXL/JAMA_Reasoning_test_Common_cot_test | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 546518
num_examples: 249
- name: valid
num_bytes: 546518
num_examples: 249
- name: test
num_bytes: 546518
num_examples: 249
download_size: 848694
dataset_size: 1639554
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
Ransaka/sinhala_synthetic_ocr-large | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 460221761.41
num_examples: 6969
download_size: 456093365
dataset_size: 460221761.41
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- si
---
If you use this data in publications, please cite it as follows:
```
@misc {ransaka_ravihara_2024,
author = { {Ransaka Ravihara} },
title = { sinhala_synthetic_ocr-large (Revision f3cac3b) },
year = 2024,
url = { https://huggingface.co/datasets/Ransaka/sinhala_synthetic_ocr-large },
doi = { 10.57967/hf/1809 },
publisher = { Hugging Face }
}
``` |
open-llm-leaderboard/details_uukuguy__SynthIA-7B-v1.3-dare-0.85 | ---
pretty_name: Evaluation run of uukuguy/SynthIA-7B-v1.3-dare-0.85
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/SynthIA-7B-v1.3-dare-0.85](https://huggingface.co/uukuguy/SynthIA-7B-v1.3-dare-0.85)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__SynthIA-7B-v1.3-dare-0.85_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T22:59:57.395887](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__SynthIA-7B-v1.3-dare-0.85_public/blob/main/results_2023-11-23T22-59-57.395887.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6384101997004026,\n\
\ \"acc_stderr\": 0.0320658451939497,\n \"acc_norm\": 0.6475312994622042,\n\
\ \"acc_norm_stderr\": 0.032755008534067175,\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361,\n \"mc2\": 0.4377418572010016,\n\
\ \"mc2_stderr\": 0.014257418960086683,\n \"em\": 0.0018875838926174498,\n\
\ \"em_stderr\": 0.0004445109990558977,\n \"f1\": 0.06350356543624144,\n\
\ \"f1_stderr\": 0.0013999691906909637\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520769,\n\
\ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892893\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6336387173869747,\n\
\ \"acc_stderr\": 0.004808251269682433,\n \"acc_norm\": 0.8349930292770364,\n\
\ \"acc_norm_stderr\": 0.00370428239078172\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337135,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337135\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.0252798503974049,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.0252798503974049\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8146788990825689,\n \"acc_stderr\": 0.01665927970029584,\n \"\
acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.01665927970029584\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.033922384053216174,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.033922384053216174\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406953,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406953\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407006,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407006\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.02410571260775431,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.02410571260775431\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n\
\ \"acc_stderr\": 0.015949308790233645,\n \"acc_norm\": 0.34972067039106147,\n\
\ \"acc_norm_stderr\": 0.015949308790233645\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4485006518904824,\n\
\ \"acc_stderr\": 0.012702317490559802,\n \"acc_norm\": 0.4485006518904824,\n\
\ \"acc_norm_stderr\": 0.012702317490559802\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.03878626771002361,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.03878626771002361\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361,\n \"mc2\": 0.4377418572010016,\n\
\ \"mc2_stderr\": 0.014257418960086683\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710686\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0018875838926174498,\n \
\ \"em_stderr\": 0.0004445109990558977,\n \"f1\": 0.06350356543624144,\n\
\ \"f1_stderr\": 0.0013999691906909637\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.18574677786201668,\n \"acc_stderr\": 0.010712298902729095\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/SynthIA-7B-v1.3-dare-0.85
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|arc:challenge|25_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|drop|3_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|gsm8k|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hellaswag|10_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T22-59-57.395887.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T22-59-57.395887.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- '**/details_harness|winogrande|5_2023-11-23T22-59-57.395887.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T22-59-57.395887.parquet'
- config_name: results
data_files:
- split: 2023_11_23T22_59_57.395887
path:
- results_2023-11-23T22-59-57.395887.parquet
- split: latest
path:
- results_2023-11-23T22-59-57.395887.parquet
---
# Dataset Card for Evaluation run of uukuguy/SynthIA-7B-v1.3-dare-0.85
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/SynthIA-7B-v1.3-dare-0.85
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/SynthIA-7B-v1.3-dare-0.85](https://huggingface.co/uukuguy/SynthIA-7B-v1.3-dare-0.85) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__SynthIA-7B-v1.3-dare-0.85_public",
"harness_winogrande_5",
	split="latest")
```
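The aggregated metrics live in the `results` configuration listed in the YAML header above; as a minimal sketch (config and split names taken from that header), they can be pulled the same way:
```python
from datasets import load_dataset

# the "results" config stores the aggregated metrics of the run;
# the "latest" split points at the most recent timestamped results
results = load_dataset("open-llm-leaderboard/details_uukuguy__SynthIA-7B-v1.3-dare-0.85_public",
	"results",
	split="latest")
print(results[0])
```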
## Latest results
These are the [latest results from run 2023-11-23T22:59:57.395887](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__SynthIA-7B-v1.3-dare-0.85_public/blob/main/results_2023-11-23T22-59-57.395887.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6384101997004026,
"acc_stderr": 0.0320658451939497,
"acc_norm": 0.6475312994622042,
"acc_norm_stderr": 0.032755008534067175,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361,
"mc2": 0.4377418572010016,
"mc2_stderr": 0.014257418960086683,
"em": 0.0018875838926174498,
"em_stderr": 0.0004445109990558977,
"f1": 0.06350356543624144,
"f1_stderr": 0.0013999691906909637
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520769,
"acc_norm": 0.6100682593856656,
"acc_norm_stderr": 0.014252959848892893
},
"harness|hellaswag|10": {
"acc": 0.6336387173869747,
"acc_stderr": 0.004808251269682433,
"acc_norm": 0.8349930292770364,
"acc_norm_stderr": 0.00370428239078172
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337135,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337135
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383887,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383887
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.0252798503974049,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.0252798503974049
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.01665927970029584,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.01665927970029584
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.033922384053216174,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.033922384053216174
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406953,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406953
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407006,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407006
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.02410571260775431,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.02410571260775431
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34972067039106147,
"acc_stderr": 0.015949308790233645,
"acc_norm": 0.34972067039106147,
"acc_norm_stderr": 0.015949308790233645
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4485006518904824,
"acc_stderr": 0.012702317490559802,
"acc_norm": 0.4485006518904824,
"acc_norm_stderr": 0.012702317490559802
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.03878626771002361,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.03878626771002361
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361,
"mc2": 0.4377418572010016,
"mc2_stderr": 0.014257418960086683
},
"harness|winogrande|5": {
"acc": 0.7892659826361483,
"acc_stderr": 0.011462046419710686
},
"harness|drop|3": {
"em": 0.0018875838926174498,
"em_stderr": 0.0004445109990558977,
"f1": 0.06350356543624144,
"f1_stderr": 0.0013999691906909637
},
"harness|gsm8k|5": {
"acc": 0.18574677786201668,
"acc_stderr": 0.010712298902729095
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/agir_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of agir/エーギル/埃吉尔 (Azur Lane)
This is the dataset of agir/エーギル/埃吉尔 (Azur Lane), containing 500 images and their tags.
The core tags of this character are `long_hair, breasts, multicolored_hair, red_hair, horns, streaked_hair, white_hair, large_breasts, yellow_eyes, demon_horns, very_long_hair, hair_between_eyes, two-tone_hair, bangs, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1012.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/agir_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 473.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/agir_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1344 | 1.04 GiB | [Download](https://huggingface.co/datasets/CyberHarem/agir_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 839.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/agir_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1344 | 1.58 GiB | [Download](https://huggingface.co/datasets/CyberHarem/agir_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/agir_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
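The IMG+TXT packages can be fetched the same way. Below is a minimal sketch for the `dataset-800.zip` package, assuming (the exact layout is not documented here) that each image is paired with a same-named `.txt` file holding its tags:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT archive from the package table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/agir_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract files to a separate directory
img_txt_dir = 'dataset_800'
os.makedirs(img_txt_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(img_txt_dir)
# pair each image with its tag file (pairing convention assumed, see above)
for name in sorted(os.listdir(img_txt_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        tag_file = os.path.join(img_txt_dir, stem + '.txt')
        if os.path.exists(tag_file):
            with open(tag_file, encoding='utf-8') as f:
                print(name, f.read().strip())
```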
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 30 |  |  |  |  |  | 1girl, bare_shoulders, bodystocking, breast_curtains, iron_cross, looking_at_viewer, solo, cross-laced_clothes, cross_earrings, hair_on_horn, underbust, black_cape, black_gloves, elbow_gloves, covered_navel |
| 1 | 5 |  |  |  |  |  | 1girl, armored_boots, asymmetrical_footwear, bare_shoulders, black_cape, black_gloves, bodystocking, breast_curtains, cross-laced_clothes, cross_earrings, hair_on_horn, iron_cross, sitting, solo, underbust, knee_boots, looking_at_viewer, non-humanoid_robot, rigging, elbow_gloves, full_body, high_heel_boots, turret, black_footwear |
| 2 | 7 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, cleavage, dress, garter_straps, looking_at_viewer, sitting, solo, thighs, black_thighhighs, jewelry, parted_lips, smile, smoke, blush, brown_thighhighs, feather_boa, crossed_legs, feet, holding_smoking_pipe, no_shoes |
| 3 | 6 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, black_thighhighs, cleavage, garter_straps, looking_at_viewer, no_shoes, sitting, solo, toes, dress, fine_fabric_emphasis, foot_focus, holding, parted_lips, soles, thighs, chinese_clothes, jewelry, official_alternate_costume, legs, smile |
| 4 | 20 |  |  |  |  |  | 1girl, looking_at_viewer, maid_headdress, official_alternate_costume, solo, thighs, white_thighhighs, ass, garter_straps, underboob, white_panties, blush, no_shoes, couch, on_side, open_mouth, frilled_hairband, skirt |
| 5 | 5 |  |  |  |  |  | 1girl, arm_garter, frilled_hairband, full_body, garter_straps, looking_at_viewer, maid_headdress, no_shoes, official_alternate_costume, on_couch, solo, white_thighhighs, ass, bare_shoulders, breast_rest, indoors, soles, thighs, underbutt, apron, feet_up, open_mouth, see-through_legwear, underboob, black_dress, lamp, legs, the_pose, tongue_out, window, wooden_floor |
| 6 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, maid_headdress, official_alternate_costume, solo, white_thighhighs, apron, cleavage, frilled_hairband, garter_straps |
| 7 | 20 |  |  |  |  |  | looking_at_viewer, navel, 1girl, solo, black_bikini, cleavage, alternate_costume, outdoors, bare_shoulders, blue_sky, day, blush |
| 8 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, open_mouth, completely_nude, navel, sex, solo_focus, sweat, vaginal, ahegao, cowgirl_position, cum_in_pussy, jewelry, looking_at_viewer, penis, tongue_out |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | bodystocking | breast_curtains | iron_cross | looking_at_viewer | solo | cross-laced_clothes | cross_earrings | hair_on_horn | underbust | black_cape | black_gloves | elbow_gloves | covered_navel | armored_boots | asymmetrical_footwear | sitting | knee_boots | non-humanoid_robot | rigging | full_body | high_heel_boots | turret | black_footwear | cleavage | dress | garter_straps | thighs | black_thighhighs | jewelry | parted_lips | smile | smoke | blush | brown_thighhighs | feather_boa | crossed_legs | feet | holding_smoking_pipe | no_shoes | toes | fine_fabric_emphasis | foot_focus | holding | soles | chinese_clothes | official_alternate_costume | legs | maid_headdress | white_thighhighs | ass | underboob | white_panties | couch | on_side | open_mouth | frilled_hairband | skirt | arm_garter | on_couch | breast_rest | indoors | underbutt | apron | feet_up | see-through_legwear | black_dress | lamp | the_pose | tongue_out | window | wooden_floor | navel | black_bikini | alternate_costume | outdoors | blue_sky | day | 1boy | hetero | nipples | completely_nude | sex | solo_focus | sweat | vaginal | ahegao | cowgirl_position | cum_in_pussy | penis |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:------------------|:-------------|:--------------------|:-------|:----------------------|:-----------------|:---------------|:------------|:-------------|:---------------|:---------------|:----------------|:----------------|:------------------------|:----------|:-------------|:---------------------|:----------|:------------|:------------------|:---------|:-----------------|:-----------|:--------|:----------------|:---------|:-------------------|:----------|:--------------|:--------|:--------|:--------|:-------------------|:--------------|:---------------|:-------|:-----------------------|:-----------|:-------|:-----------------------|:-------------|:----------|:--------|:------------------|:-----------------------------|:-------|:-----------------|:-------------------|:------|:------------|:----------------|:--------|:----------|:-------------|:-------------------|:--------|:-------------|:-----------|:--------------|:----------|:------------|:--------|:----------|:----------------------|:--------------|:-------|:-----------|:-------------|:---------|:---------------|:--------|:---------------|:--------------------|:-----------|:-----------|:------|:-------|:---------|:----------|:------------------|:------|:-------------|:--------|:----------|:---------|:-------------------|:---------------|:--------|
| 0 | 30 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | | | | X | X | | | | | | X | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | | | | X | X | | | | | | X | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 20 |  |  |  |  |  | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | X | | | | | | X | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | | | | X | X | | | | | | | | | | | | | | | X | | | | | | X | X | | | | | | | | | | | | X | | | | | X | | X | X | X | X | X | X | | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | | | X | X | | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | X | | X | X | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 20 |  |  |  |  |  | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
imvbhuvan/demo | ---
license: mit
---
|
pccl-org/formal-logic-simple-order-multi-token-fixed-objects-paired-relationship-0-100 | ---
dataset_info:
features:
- name: greater_than
sequence: int64
- name: less_than
sequence: int64
- name: paired_example
sequence:
sequence:
sequence: int64
- name: correct_example
sequence:
sequence: int64
- name: incorrect_example
sequence:
sequence: int64
- name: distance
dtype: int64
- name: index
dtype: int64
- name: index_in_distance
dtype: int64
splits:
- name: train
num_bytes: 1434312
num_examples: 4950
download_size: 145456
dataset_size: 1434312
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
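A minimal, hedged sketch of loading and inspecting the feature layout declared in `dataset_info` above (field names are taken from the schema; public availability of the repo is assumed):
```python
from datasets import load_dataset

# the "train" split path comes from the configs section above
ds = load_dataset(
    "pccl-org/formal-logic-simple-order-multi-token-fixed-objects-paired-relationship-0-100",
    split="train",
)
row = ds[0]
print(row["greater_than"], row["less_than"])    # flat int64 sequences
print(len(row["paired_example"]))               # triply nested int64 sequence
print(row["distance"], row["index"], row["index_in_distance"])
```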
|
adityarra07/train_1000_2 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 133278697.26379317
num_examples: 1000
- name: test
num_bytes: 26655739.452758636
num_examples: 200
download_size: 164191192
dataset_size: 159934436.7165518
---
# Dataset Card for "train_1000_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
quangss2410/haha | ---
license: openrail
---
|
CyberHarem/kirin_r_yato_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kirin_r_yato/キリンRヤトウ/麒麟R夜刀 (Arknights)
This is the dataset of kirin_r_yato/キリンRヤトウ/麒麟R夜刀 (Arknights), containing 80 images and their tags.
The core tags of this character are `horns, long_hair, breasts, brown_hair, blue_eyes, multicolored_hair, hair_between_eyes, fake_horns, white_hair, pointy_ears, large_breasts, mole, mole_under_eye, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 80 | 182.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirin_r_yato_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 80 | 148.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirin_r_yato_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 213 | 297.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirin_r_yato_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kirin_r_yato_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
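The same source can also be filtered by tag; a small sketch, assuming `item.meta['tags']` supports membership tests (the loop above prints it as-is):
```python
from waifuc.source import LocalSource

# continues from the extraction above: 'dataset_dir' holds the unzipped raw data
wanted = 'kirin_(armor)'  # tag taken from the cluster table below
matches = [item for item in LocalSource('dataset_dir')
           if wanted in item.meta['tags']]
print(f'{len(matches)} images carry the tag {wanted!r}')
```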
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, bare_shoulders, kirin_(armor), midriff, navel, solo, stomach, looking_at_viewer, black_belt, cleavage, necklace, black_gloves, fur_trim, simple_background, thighhighs, garter_straps, single_detached_sleeve, cowboy_shot, white_background, crop_top, holding_weapon, belt_buckle, standing, smile, hairband, pendant, skirt |
| 1 | 11 |  |  |  |  |  | 1girl, bare_shoulders, kirin_(armor), midriff, navel, necklace, solo, stomach, cleavage, black_belt, black_gloves, cowboy_shot, looking_at_viewer, white_background, crop_top, standing, fur_trim, hairband, pendant, simple_background, single_horn, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | kirin_(armor) | midriff | navel | solo | stomach | looking_at_viewer | black_belt | cleavage | necklace | black_gloves | fur_trim | simple_background | thighhighs | garter_straps | single_detached_sleeve | cowboy_shot | white_background | crop_top | holding_weapon | belt_buckle | standing | smile | hairband | pendant | skirt | single_horn | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:----------------|:----------|:--------|:-------|:----------|:--------------------|:-------------|:-----------|:-----------|:---------------|:-----------|:--------------------|:-------------|:----------------|:-------------------------|:--------------|:-------------------|:-----------|:-----------------|:--------------|:-----------|:--------|:-----------|:----------|:--------|:--------------|:-------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | X | X | X | | | X | | X | X | | X | X |
|
open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r128_a16 | ---
pretty_name: Evaluation run of BFauber/lora_llama2-13b_10e5_r128_a16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/lora_llama2-13b_10e5_r128_a16](https://huggingface.co/BFauber/lora_llama2-13b_10e5_r128_a16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r128_a16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T01:06:53.284572](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r128_a16/blob/main/results_2024-02-10T01-06-53.284572.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5553516912928418,\n\
\ \"acc_stderr\": 0.03366093927931328,\n \"acc_norm\": 0.561202247356678,\n\
\ \"acc_norm_stderr\": 0.034381877649567884,\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.38216302938189795,\n\
\ \"mc2_stderr\": 0.013788037888201266\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5622866894197952,\n \"acc_stderr\": 0.014497573881108287,\n\
\ \"acc_norm\": 0.5989761092150171,\n \"acc_norm_stderr\": 0.01432225579071987\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.616211909978092,\n\
\ \"acc_stderr\": 0.004853134271547768,\n \"acc_norm\": 0.8231428002389962,\n\
\ \"acc_norm_stderr\": 0.0038076803311729037\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.03789401760283647,\n\
\ \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.03789401760283647\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n\
\ \"acc_stderr\": 0.04280105837364396,\n \"acc_norm\": 0.24509803921568626,\n\
\ \"acc_norm_stderr\": 0.04280105837364396\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.451063829787234,\n\
\ \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.451063829787234,\n\
\ \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893624,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893624\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"\
acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3253968253968254,\n \"acc_stderr\": 0.024130158299762613,\n \"\
acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.024130158299762613\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127152,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127152\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6903225806451613,\n\
\ \"acc_stderr\": 0.026302774983517418,\n \"acc_norm\": 0.6903225806451613,\n\
\ \"acc_norm_stderr\": 0.026302774983517418\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512566,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512566\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6919191919191919,\n \"acc_stderr\": 0.032894773300986155,\n \"\
acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.032894773300986155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608466,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608466\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7522935779816514,\n \"acc_stderr\": 0.018508143602547815,\n \"\
acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.018508143602547815\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7130801687763713,\n \"acc_stderr\": 0.02944377302259469,\n \
\ \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.02944377302259469\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.02581923325648372,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.02581923325648372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7445721583652618,\n\
\ \"acc_stderr\": 0.015594955384455765,\n \"acc_norm\": 0.7445721583652618,\n\
\ \"acc_norm_stderr\": 0.015594955384455765\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016124,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n\
\ \"acc_stderr\": 0.014950103002475358,\n \"acc_norm\": 0.2759776536312849,\n\
\ \"acc_norm_stderr\": 0.014950103002475358\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302895,\n\
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302895\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200865,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200865\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.02662415247884585,\n\
\ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.02662415247884585\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.02923346574557308,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.02923346574557308\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4172099087353325,\n\
\ \"acc_stderr\": 0.012593959992906422,\n \"acc_norm\": 0.4172099087353325,\n\
\ \"acc_norm_stderr\": 0.012593959992906422\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5571895424836601,\n \"acc_stderr\": 0.02009508315457734,\n \
\ \"acc_norm\": 0.5571895424836601,\n \"acc_norm_stderr\": 0.02009508315457734\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789848,\n\
\ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789848\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.38216302938189795,\n\
\ \"mc2_stderr\": 0.013788037888201266\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838236\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.23881728582259287,\n \
\ \"acc_stderr\": 0.011744097081003805\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/lora_llama2-13b_10e5_r128_a16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|arc:challenge|25_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|gsm8k|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hellaswag|10_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-06-53.284572.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T01-06-53.284572.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- '**/details_harness|winogrande|5_2024-02-10T01-06-53.284572.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T01-06-53.284572.parquet'
- config_name: results
data_files:
- split: 2024_02_10T01_06_53.284572
path:
- results_2024-02-10T01-06-53.284572.parquet
- split: latest
path:
- results_2024-02-10T01-06-53.284572.parquet
---
# Dataset Card for Evaluation run of BFauber/lora_llama2-13b_10e5_r128_a16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/lora_llama2-13b_10e5_r128_a16](https://huggingface.co/BFauber/lora_llama2-13b_10e5_r128_a16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r128_a16",
"harness_winogrande_5",
split="train")
```
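The other configurations listed in the YAML header above can be loaded the same way. As a minimal sketch (assuming the `results` configuration and the `latest` split defined in the header above), loading the aggregated scores looks like this:
```python
from datasets import load_dataset

# "results" is the aggregated-metrics configuration defined in the YAML header;
# the "latest" split always points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r128_a16",
    "results",
    split="latest",
)
print(results[0])  # one row per run, containing the aggregated metrics
```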
## Latest results
These are the [latest results from run 2024-02-10T01:06:53.284572](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r128_a16/blob/main/results_2024-02-10T01-06-53.284572.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5553516912928418,
"acc_stderr": 0.03366093927931328,
"acc_norm": 0.561202247356678,
"acc_norm_stderr": 0.034381877649567884,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.38216302938189795,
"mc2_stderr": 0.013788037888201266
},
"harness|arc:challenge|25": {
"acc": 0.5622866894197952,
"acc_stderr": 0.014497573881108287,
"acc_norm": 0.5989761092150171,
"acc_norm_stderr": 0.01432225579071987
},
"harness|hellaswag|10": {
"acc": 0.616211909978092,
"acc_stderr": 0.004853134271547768,
"acc_norm": 0.8231428002389962,
"acc_norm_stderr": 0.0038076803311729037
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364396,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364396
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.024130158299762613,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.024130158299762613
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127152,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127152
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517418,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517418
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512566,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512566
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.032894773300986155,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.032894773300986155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5102564102564102,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.5102564102564102,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608466,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608466
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.018508143602547815,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.018508143602547815
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.02581923325648372,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.02581923325648372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7445721583652618,
"acc_stderr": 0.015594955384455765,
"acc_norm": 0.7445721583652618,
"acc_norm_stderr": 0.015594955384455765
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016124,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475358,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475358
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.027684181883302895,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.027684181883302895
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200865,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200865
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.02662415247884585,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.02662415247884585
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.02923346574557308,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.02923346574557308
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4172099087353325,
"acc_stderr": 0.012593959992906422,
"acc_norm": 0.4172099087353325,
"acc_norm_stderr": 0.012593959992906422
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5571895424836601,
"acc_stderr": 0.02009508315457734,
"acc_norm": 0.5571895424836601,
"acc_norm_stderr": 0.02009508315457734
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789848,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789848
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.38216302938189795,
"mc2_stderr": 0.013788037888201266
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838236
},
"harness|gsm8k|5": {
"acc": 0.23881728582259287,
"acc_stderr": 0.011744097081003805
}
}
```
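If you only need the raw numbers above, one option (a sketch, not part of the auto-generated card) is to download the linked results JSON directly with `huggingface_hub`:
```python
import json

from huggingface_hub import hf_hub_download

# Download the results file referenced above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r128_a16",
    filename="results_2024-02-10T01-06-53.284572.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The aggregate block shown above; depending on the file layout it may be
# nested under a top-level "results" key.
metrics = data.get("results", data)
print(metrics["all"]["acc"])
```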
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_sequelbox__DaringFortitude | ---
pretty_name: Evaluation run of sequelbox/DaringFortitude
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sequelbox/DaringFortitude](https://huggingface.co/sequelbox/DaringFortitude)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sequelbox__DaringFortitude_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-15T00:35:47.431209](https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__DaringFortitude_public/blob/main/results_2023-11-15T00-35-47.431209.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5932217761298214,\n\
\ \"acc_stderr\": 0.03305656216343214,\n \"acc_norm\": 0.6027951864354921,\n\
\ \"acc_norm_stderr\": 0.03382034227909779,\n \"mc1\": 0.40269277845777235,\n\
\ \"mc1_stderr\": 0.017168830935187215,\n \"mc2\": 0.559561930249219,\n\
\ \"mc2_stderr\": 0.015693079433704838,\n \"em\": 0.01950503355704698,\n\
\ \"em_stderr\": 0.0014162361849700607,\n \"f1\": 0.12218750000000013,\n\
\ \"f1_stderr\": 0.002284380268622334\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6032423208191127,\n \"acc_stderr\": 0.01429651302018063,\n\
\ \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.014070265519268802\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6360286795459071,\n\
\ \"acc_stderr\": 0.004801572028920796,\n \"acc_norm\": 0.8355905198167696,\n\
\ \"acc_norm_stderr\": 0.003698892388380099\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.03268335899936336,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.03268335899936336\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873634,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873634\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819067,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819067\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.026729499068349958,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.026729499068349958\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187222,\n\
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187222\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7889908256880734,\n \"acc_stderr\": 0.017493922404112648,\n \"\
acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.017493922404112648\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.02615686752393104,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02615686752393104\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.02336505149175372,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.02336505149175372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7956577266922095,\n\
\ \"acc_stderr\": 0.0144191239809319,\n \"acc_norm\": 0.7956577266922095,\n\
\ \"acc_norm_stderr\": 0.0144191239809319\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.02557412378654667,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.02557412378654667\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48268156424581005,\n\
\ \"acc_stderr\": 0.01671246744170252,\n \"acc_norm\": 0.48268156424581005,\n\
\ \"acc_norm_stderr\": 0.01671246744170252\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615693,\n\
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n\
\ \"acc_stderr\": 0.012725701656953642,\n \"acc_norm\": 0.45827900912646674,\n\
\ \"acc_norm_stderr\": 0.012725701656953642\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5915032679738562,\n \"acc_stderr\": 0.01988622103750187,\n \
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.01988622103750187\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.03002105623844031,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.03002105623844031\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n\
\ \"mc1_stderr\": 0.017168830935187215,\n \"mc2\": 0.559561930249219,\n\
\ \"mc2_stderr\": 0.015693079433704838\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650865\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.01950503355704698,\n \
\ \"em_stderr\": 0.0014162361849700607,\n \"f1\": 0.12218750000000013,\n\
\ \"f1_stderr\": 0.002284380268622334\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.08794541319181198,\n \"acc_stderr\": 0.007801162197487721\n\
\ }\n}\n```"
repo_url: https://huggingface.co/sequelbox/DaringFortitude
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|arc:challenge|25_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|drop|3_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|gsm8k|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hellaswag|10_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T00-35-47.431209.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T00-35-47.431209.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- '**/details_harness|winogrande|5_2023-11-15T00-35-47.431209.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-15T00-35-47.431209.parquet'
- config_name: results
data_files:
- split: 2023_11_15T00_35_47.431209
path:
- results_2023-11-15T00-35-47.431209.parquet
- split: latest
path:
- results_2023-11-15T00-35-47.431209.parquet
---
# Dataset Card for Evaluation run of sequelbox/DaringFortitude
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/sequelbox/DaringFortitude
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [sequelbox/DaringFortitude](https://huggingface.co/sequelbox/DaringFortitude) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sequelbox__DaringFortitude_public",
"harness_winogrande_5",
split="train")
```
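The aggregated "results" configuration can be loaded in the same way; a minimal sketch (the config and split names come from the YAML header above):
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics of the run; the "latest"
# split always points to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_sequelbox__DaringFortitude_public",
    "results",
    split="latest",
)
print(results[0])
```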
## Latest results
These are the [latest results from run 2023-11-15T00:35:47.431209](https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__DaringFortitude_public/blob/main/results_2023-11-15T00-35-47.431209.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.5932217761298214,
"acc_stderr": 0.03305656216343214,
"acc_norm": 0.6027951864354921,
"acc_norm_stderr": 0.03382034227909779,
"mc1": 0.40269277845777235,
"mc1_stderr": 0.017168830935187215,
"mc2": 0.559561930249219,
"mc2_stderr": 0.015693079433704838,
"em": 0.01950503355704698,
"em_stderr": 0.0014162361849700607,
"f1": 0.12218750000000013,
"f1_stderr": 0.002284380268622334
},
"harness|arc:challenge|25": {
"acc": 0.6032423208191127,
"acc_stderr": 0.01429651302018063,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.014070265519268802
},
"harness|hellaswag|10": {
"acc": 0.6360286795459071,
"acc_stderr": 0.004801572028920796,
"acc_norm": 0.8355905198167696,
"acc_norm_stderr": 0.003698892388380099
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.03268335899936336,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.03268335899936336
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873634,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873634
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819067,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819067
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.026729499068349958,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.026729499068349958
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187222,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187222
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.017493922404112648,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.017493922404112648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02615686752393104,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02615686752393104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.02336505149175372,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.02336505149175372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7956577266922095,
"acc_stderr": 0.0144191239809319,
"acc_norm": 0.7956577266922095,
"acc_norm_stderr": 0.0144191239809319
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.02557412378654667,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.02557412378654667
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48268156424581005,
"acc_stderr": 0.01671246744170252,
"acc_norm": 0.48268156424581005,
"acc_norm_stderr": 0.01671246744170252
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.026925654653615693,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.026925654653615693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45827900912646674,
"acc_stderr": 0.012725701656953642,
"acc_norm": 0.45827900912646674,
"acc_norm_stderr": 0.012725701656953642
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.01988622103750187,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.01988622103750187
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.03002105623844031,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.03002105623844031
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40269277845777235,
"mc1_stderr": 0.017168830935187215,
"mc2": 0.559561930249219,
"mc2_stderr": 0.015693079433704838
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650865
},
"harness|drop|3": {
"em": 0.01950503355704698,
"em_stderr": 0.0014162361849700607,
"f1": 0.12218750000000013,
"f1_stderr": 0.002284380268622334
},
"harness|gsm8k|5": {
"acc": 0.08794541319181198,
"acc_stderr": 0.007801162197487721
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
chronbmm/sandhi-split-long-2018 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: unsandhied
dtype: string
splits:
- name: train
num_bytes: 58896572
num_examples: 109152
- name: validation
num_bytes: 6548762
num_examples: 12128
- name: test
num_bytes: 6548762
num_examples: 12128
- name: test_500
num_bytes: 273816
num_examples: 500
- name: validation_500
num_bytes: 273816
num_examples: 500
download_size: 46961402
dataset_size: 72541728
---
# Dataset Card for "sandhi-split-long-2018"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/random_letter_same_length_find_passage_train100_eval10_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 63274
num_examples: 210
- name: validation
num_bytes: 3262
num_examples: 10
download_size: 33107
dataset_size: 66536
---
# Dataset Card for "random_letter_same_length_find_passage_train100_eval10_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
manojdec25/diamond-price-predictor-logs2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
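Pending a full structure description, the YAML header above defines a single `default` config whose `train` split reads `data.csv`; a minimal loading sketch:
```python
from datasets import load_dataset

# The default config maps the train split to data.csv (see the YAML header);
# column names and types are inferred from the CSV itself.
ds = load_dataset("manojdec25/diamond-price-predictor-logs2", split="train")
print(ds.column_names)
```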
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Nexdata/40_People_3D_2D_Living_Face_Anti_Spoofing_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
40 People – 3D&2D Living_Face & Anti_Spoofing Data. The collection scenes are indoor and outdoor scenes. The dataset includes males and females, with an age distribution of 18-57 years old. The devices include cellphones, cameras, and iPhones of multiple models (iPhone X or more advanced iPhone models). The data diversity includes multiple devices, multiple actions, multiple facial postures, multiple anti-spoofing samples, multiple light conditions, and multiple scenes. This data can be used for tasks such as 2D Living_Face & Anti_Spoofing, 2D face recognition, 3D face recognition, and 3D Living_Face & Anti_Spoofing.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1198?source=Huggingface
# Specifications
## Data size
40 people, 48 videos and 150 groups (252 images) for each person
## Population distribution
race distribution: Asian; gender distribution: 20 males, 20 females; age distribution: ranging from 18 to 57
## Collecting environment
20 people in indoor scenes, 20 people in outdoor scenes
## Data diversity
multiple devices, multiple actions, multiple facial postures, multiple anti-spoofing samples, multiple light conditions, multiple scenes
## Device
cellphone, camera, iPhone of multiple models (iPhone X or more advanced iPhone models)
## Data format
.mp4, .mov, .jpg, .xml, .json
## Annotation content
label the person ID, race, gender, age, scene, facial action, light condition
## Accuracy
based on the accuracy of the actions, the accuracy exceeds 97%; the accuracy of label annotation is not less than 97%
# Licensing Information
Commercial License
|
jeongseon/cp-final-project-preprocessed | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 48027206752
num_examples: 50000
- name: valid
num_bytes: 2444583512
num_examples: 2545
download_size: 10102545245
dataset_size: 50471790264
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
---
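A minimal loading sketch (feature and split names come from the `dataset_info` header above; note the roughly 10 GB download size):
```python
from datasets import load_dataset

# Features per the dataset_info header: input_features (nested float32
# sequences) and labels (an int64 sequence); splits are train and valid.
ds = load_dataset("jeongseon/cp-final-project-preprocessed")
sample = ds["train"][0]
print(len(sample["input_features"]), len(sample["labels"]))
```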
|
liuyanchen1015/VALUE_rte_lexical | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 85081
num_examples: 240
- name: test
num_bytes: 919558
num_examples: 2621
- name: train
num_bytes: 797382
num_examples: 2157
download_size: 1216681
dataset_size: 1802021
---
# Dataset Card for "VALUE_rte_lexical"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mnli_yall | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 342366
num_examples: 1559
- name: dev_mismatched
num_bytes: 257988
num_examples: 1378
- name: test_matched
num_bytes: 344313
num_examples: 1545
- name: test_mismatched
num_bytes: 250427
num_examples: 1321
- name: train
num_bytes: 14171879
num_examples: 63320
download_size: 9179821
dataset_size: 15366973
---
# Dataset Card for "MULTI_VALUE_mnli_yall"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tiagofvb/reddit_r_carros | ---
license: apache-2.0
---
The Reddit r/carros Conversational Dataset is a collection of text-based conversations sourced from the popular online community, "r/carros." This dataset is compiled to provide a valuable resource for research and analysis in the realm of natural language processing, with a specific focus on automotive-related discussions.
Column Descriptions:
Comment:
The "Comment" column contains the original user-generated text or comment posted by participants within the r/carros subreddit. These comments encompass a diverse array of topics related to automobiles, including discussions about car models, brands, features, maintenance, reviews, and other automotive-related subjects. The language used in the comments may vary in style, tone, and technicality, providing a rich linguistic landscape for exploration.
Reply:
In the "Reply" column, you will find the corresponding responses to the comments made in the "Comment" column. These responses represent reactions, opinions, suggestions, or follow-up statements provided by other members of the r/carros community in the context of the original comment. The replies capture the conversational dynamics and engagement within the subreddit, offering insights into the collective knowledge and experiences of automotive enthusiasts. |
Simonk97/dataset | ---
license: openrail
---
|
Kabatubare/medical | ---
tags:
- healthcare
- qna
- nlp
- english
license: other
language:
- en
pretty_name: Medical QnA Datasets
---
# Dataset Card for "Medical" Healthcare QnA Datasets
## Dataset Details
### Dataset Description
The "Medical" dataset is a specialized subset curated from the larger MedDialog collection, featuring healthcare dialogues between doctors and patients. This dataset focuses on conversations from Icliniq, HealthcareMagic, and HealthTap. Written primarily in English, it is designed to serve a broad range of applications such as NLP research, healthcare chatbot development, and medical information retrieval. The dataset contains 24,000 rows.
- **Data Sources**: Curated from MedDialog, focusing on Icliniq, HealthcareMagic, and HealthTap
- **Size**: 24,000 rows
- **Language**: English
### Direct Uses:
- NLP research in healthcare dialogues
- Development of healthcare question-answering systems
- Medical information retrieval
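A minimal loading sketch (the card does not document the config, split, or field names, so inspect the loaded object before relying on any of them):
```python
from datasets import load_dataset

# Load everything and print the available splits and columns; field names
# are not documented in this card, so check the schema before use.
ds = load_dataset("Kabatubare/medical")
print(ds)
```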
### Limitations and Recommendations:
- Not a substitute for certified medical advice
- Exercise caution in critical healthcare applications
|
CyberHarem/pola_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of pola/ポーラ (Kantai Collection)
This is the dataset of pola/ポーラ (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `long_hair, grey_hair, wavy_hair, brown_eyes, breasts, hair_between_eyes, hat, mini_hat, large_breasts, thick_eyebrows, tilted_headwear, bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 672.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pola_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 364.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pola_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1204 | 788.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pola_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 586.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pola_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1204 | 1.14 GiB | [Download](https://huggingface.co/datasets/CyberHarem/pola_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pola_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
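The pre-packaged IMG+TXT archives in the table above can be fetched the same way; a minimal sketch for the 800-pixel package (the filename comes from its download link):
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/pola_kantaicollection',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract image/text pairs to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```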
## List of Clusters
List of the tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, black_gloves, solo, white_coat, pink_scarf, upper_body, long_sleeves, white_headwear, looking_at_viewer, grey_coat, official_alternate_costume, simple_background, white_background |
| 1 | 7 |  |  |  |  |  | 1girl, blush, long_sleeves, simple_background, solo, white_background, white_shirt, wine_glass, holding_cup, naked_shirt, open_shirt, sitting, cleavage, looking_at_viewer, open_mouth, smile, closed_eyes, collarbone, collared_shirt, cropped_legs, navel, one-hour_drawing_challenge, sparkle, twitter_username |
| 2 | 5 |  |  |  |  |  | 1girl, corset, long_sleeves, looking_at_viewer, red_bowtie, red_skirt, simple_background, solo, white_background, white_shirt, white_thighhighs, miniskirt, blush, open_mouth, sitting, smile |
| 3 | 13 |  |  |  |  |  | 1girl, solo, upper_body, white_shirt, corset, looking_at_viewer, red_bowtie, simple_background, long_sleeves, white_background, smile, blush, one-hour_drawing_challenge |
| 4 | 5 |  |  |  |  |  | 1girl, blush, solo, wine_glass, drunk, looking_at_viewer, open_mouth, white_thighhighs, wine_bottle, sitting, smile, areola_slip, brown_hair, convenient_censoring, miniskirt, no_panties |
| 5 | 5 |  |  |  |  |  | 1girl, blush, white_shirt, holding_cup, long_sleeves, looking_at_viewer, smile, solo, upper_body, open_mouth, simple_background, white_background, collared_shirt |
| 6 | 18 |  |  |  |  |  | 1girl, fake_animal_ears, rabbit_ears, solo, detached_collar, playboy_bunny, alternate_costume, strapless_leotard, simple_background, wrist_cuffs, cleavage, looking_at_viewer, black_pantyhose, black_leotard, red_bowtie, white_background, wine_bottle, wine_glass, rabbit_tail |
| 7 | 5 |  |  |  |  |  | 1girl, enmaided, looking_at_viewer, maid_headdress, simple_background, solo, black_dress, cowboy_shot, frilled_apron, holding, long_sleeves, white_apron, white_background, cleavage_cutout, alcohol, blush, dated, one-hour_drawing_challenge, wine_glass |
| 8 | 6 |  |  |  |  |  | 1girl, cloud, outdoors, solo, looking_at_viewer, blue_sky, cleavage, cowboy_shot, day, red_bikini, beach, navel, ocean, shirt, water |
| 9 | 8 |  |  |  |  |  | 1girl, solo, alternate_costume, looking_at_viewer, white_dress, blush, smile, cleavage, flower, bouquet, jewelry, wedding_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | solo | white_coat | pink_scarf | upper_body | long_sleeves | white_headwear | looking_at_viewer | grey_coat | official_alternate_costume | simple_background | white_background | blush | white_shirt | wine_glass | holding_cup | naked_shirt | open_shirt | sitting | cleavage | open_mouth | smile | closed_eyes | collarbone | collared_shirt | cropped_legs | navel | one-hour_drawing_challenge | sparkle | twitter_username | corset | red_bowtie | red_skirt | white_thighhighs | miniskirt | drunk | wine_bottle | areola_slip | brown_hair | convenient_censoring | no_panties | fake_animal_ears | rabbit_ears | detached_collar | playboy_bunny | alternate_costume | strapless_leotard | wrist_cuffs | black_pantyhose | black_leotard | rabbit_tail | enmaided | maid_headdress | black_dress | cowboy_shot | frilled_apron | holding | white_apron | cleavage_cutout | alcohol | dated | cloud | outdoors | blue_sky | day | red_bikini | beach | ocean | shirt | water | white_dress | flower | bouquet | jewelry | wedding_dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:-------------|:-------------|:-------------|:---------------|:-----------------|:--------------------|:------------|:-----------------------------|:--------------------|:-------------------|:--------|:--------------|:-------------|:--------------|:--------------|:-------------|:----------|:-----------|:-------------|:--------|:--------------|:-------------|:-----------------|:---------------|:--------|:-----------------------------|:----------|:-------------------|:---------|:-------------|:------------|:-------------------|:------------|:--------|:--------------|:--------------|:-------------|:-----------------------|:-------------|:-------------------|:--------------|:------------------|:----------------|:--------------------|:--------------------|:--------------|:------------------|:----------------|:--------------|:-----------|:-----------------|:--------------|:--------------|:----------------|:----------|:--------------|:------------------|:----------|:--------|:--------|:-----------|:-----------|:------|:-------------|:--------|:--------|:--------|:--------|:--------------|:---------|:----------|:----------|:----------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | | X | | | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | | | | X | | X | | | X | X | X | X | | | | | X | | X | X | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 13 |  |  |  |  |  | X | | X | | | X | X | | X | | | X | X | X | X | | | | | | | | X | | | | | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | | | | | | X | | | | | X | | X | | | | X | | X | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | | | X | X | | X | | | X | X | X | X | | X | | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 18 |  |  |  |  |  | X | | X | | | | | | X | | | X | X | | | X | | | | | X | | | | | | | | | | | | X | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | X | | | | X | | X | | | X | X | X | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | X | | | | | | X | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | | | | | |
| 9 | 8 |  |  |  |  |  | X | | X | | | | | | X | | | | | X | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X |
|
anjunhu/naively_captioned_CUB2002011_test_5shot | ---
dataset_info:
features:
- name: text
dtype: string
- name: text_cupl
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 27655072.0
num_examples: 1000
download_size: 27567951
dataset_size: 27655072.0
---
# Dataset Card for "naively_captioned_CUB2002011_test_5shot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sudeepag/sampled-t0_zsopt_data | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: _template_idx
dtype: int64
- name: _task_source
dtype: string
- name: _task_name
dtype: string
- name: _template_type
dtype: string
splits:
- name: train
num_bytes: 3221637126.012135
num_examples: 4165239
download_size: 1784918986
dataset_size: 3221637126.012135
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Abhinav-B/finetune_llama_gpt | ---
dataset_info:
features:
- name: formatted_text
dtype: string
splits:
- name: train
num_bytes: 34788
num_examples: 100
download_size: 8789
dataset_size: 34788
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Gille__StrangeMerges_10-7B-slerp | ---
pretty_name: Evaluation run of Gille/StrangeMerges_10-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gille/StrangeMerges_10-7B-slerp](https://huggingface.co/Gille/StrangeMerges_10-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_10-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T02:55:04.492502](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_10-7B-slerp/blob/main/results_2024-02-02T02-55-04.492502.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6542458004549463,\n\
\ \"acc_stderr\": 0.03204861565652575,\n \"acc_norm\": 0.6539758320346176,\n\
\ \"acc_norm_stderr\": 0.03271443876560244,\n \"mc1\": 0.543451652386781,\n\
\ \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6948877994288644,\n\
\ \"mc2_stderr\": 0.014809641585651314\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6919795221843004,\n \"acc_stderr\": 0.013491429517292038,\n\
\ \"acc_norm\": 0.7235494880546075,\n \"acc_norm_stderr\": 0.013069662474252423\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7026488747261501,\n\
\ \"acc_stderr\": 0.004561582009834578,\n \"acc_norm\": 0.8829914359689305,\n\
\ \"acc_norm_stderr\": 0.0032077357692780455\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337135,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337135\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\
\ \"acc_stderr\": 0.02289168798455496,\n \"acc_norm\": 0.7967741935483871,\n\
\ \"acc_norm_stderr\": 0.02289168798455496\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\"\
: 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948482,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948482\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374307,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374307\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n\
\ \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n\
\ \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"\
acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n\
\ \"acc_stderr\": 0.012750151802922436,\n \"acc_norm\": 0.47196870925684486,\n\
\ \"acc_norm_stderr\": 0.012750151802922436\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.543451652386781,\n\
\ \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6948877994288644,\n\
\ \"mc2_stderr\": 0.014809641585651314\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.01043091746823743\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7012888551933283,\n \
\ \"acc_stderr\": 0.012607137125693639\n }\n}\n```"
repo_url: https://huggingface.co/Gille/StrangeMerges_10-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|arc:challenge|25_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|gsm8k|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hellaswag|10_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T02-55-04.492502.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T02-55-04.492502.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- '**/details_harness|winogrande|5_2024-02-02T02-55-04.492502.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T02-55-04.492502.parquet'
- config_name: results
data_files:
- split: 2024_02_02T02_55_04.492502
path:
- results_2024-02-02T02-55-04.492502.parquet
- split: latest
path:
- results_2024-02-02T02-55-04.492502.parquet
---
# Dataset Card for Evaluation run of Gille/StrangeMerges_10-7B-slerp
Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_10-7B-slerp](https://huggingface.co/Gille/StrangeMerges_10-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_10-7B-slerp",
"harness_winogrande_5",
split="train")
```
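The aggregated metrics can be loaded the same way; as a minimal sketch, assuming the "results" configuration and "latest" split listed in the config section above:
```python
from datasets import load_dataset

# Minimal sketch: load the aggregated metrics from the "results" configuration.
# The "latest" split always points to the most recent evaluation run
# (here 2024-02-02T02:55:04.492502).
results = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_10-7B-slerp",
    "results",
    split="latest",
)
```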
## Latest results
These are the [latest results from run 2024-02-02T02:55:04.492502](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_10-7B-slerp/blob/main/results_2024-02-02T02-55-04.492502.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6542458004549463,
"acc_stderr": 0.03204861565652575,
"acc_norm": 0.6539758320346176,
"acc_norm_stderr": 0.03271443876560244,
"mc1": 0.543451652386781,
"mc1_stderr": 0.017437280953183688,
"mc2": 0.6948877994288644,
"mc2_stderr": 0.014809641585651314
},
"harness|arc:challenge|25": {
"acc": 0.6919795221843004,
"acc_stderr": 0.013491429517292038,
"acc_norm": 0.7235494880546075,
"acc_norm_stderr": 0.013069662474252423
},
"harness|hellaswag|10": {
"acc": 0.7026488747261501,
"acc_stderr": 0.004561582009834578,
"acc_norm": 0.8829914359689305,
"acc_norm_stderr": 0.0032077357692780455
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337135,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337135
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455496,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455496
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948482,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374307,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374307
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.01654788799741611,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.01654788799741611
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.012750151802922436,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.012750151802922436
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.543451652386781,
"mc1_stderr": 0.017437280953183688,
"mc2": 0.6948877994288644,
"mc2_stderr": 0.014809641585651314
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.01043091746823743
},
"harness|gsm8k|5": {
"acc": 0.7012888551933283,
"acc_stderr": 0.012607137125693639
}
}
```
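For readers who want to post-process these scores, below is a minimal sketch of how the per-task MMLU accuracies could be extracted and averaged. It assumes the JSON object shown above has been saved locally as `results.json` (the filename is an assumption for illustration; this snippet is not part of the leaderboard tooling).

```python
import json

# Minimal sketch (assumption: the results JSON above is saved as `results.json`
# with the task names as top-level keys, as shown in this card).
with open("results.json") as f:
    results = json.load(f)

# Collect the accuracy of every MMLU (hendrycksTest) subtask.
mmlu_acc = {
    task: scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}

for task, acc in sorted(mmlu_acc.items()):
    print(f"{task}: {acc:.4f}")

# Unweighted mean over subtasks (note: the leaderboard may weight differently).
print(f"Unweighted MMLU mean: {sum(mmlu_acc.values()) / len(mmlu_acc):.4f}")
```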
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]