| datasetId | card |
|---|---|
fxmarty/transformers-regressions | ---
license: mit
---
|
martyn/crazy_code | ---
license: mit
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 13960
num_examples: 7
download_size: 10741
dataset_size: 13960
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Crazy Code dataset
This dataset collects code samples that demonstrate exceptional, near-superhuman coding ability.
## WIP
This project is in early development; create an issue or reach out to me on GitHub / Twitter.
## GitHub
See [https://github.com/martyn/crazy_code](https://github.com/martyn/crazy_code) |
phiyodr/InpaintCOCO | ---
pretty_name: InpaintCOCO
language:
- en
size_categories:
- 1K<n<10K
task_categories:
- image-to-text
- text-to-image
- image-classification
task_ids:
- image-captioning
tags:
- coco
- image-captioning
- inpainting
- multimodal-understanding
dataset_info:
features:
- name: concept
dtype: string
- name: coco_caption
dtype: string
- name: coco_image
dtype: image
- name: inpaint_caption
dtype: string
- name: inpaint_image
dtype: image
- name: mask
dtype: image
- name: worker
dtype: string
- name: coco_details
struct:
- name: captions
sequence: string
- name: coco_url
dtype: string
- name: date_captured
dtype: string
- name: flickr_url
dtype: string
- name: height
dtype: int64
- name: id
dtype: int64
- name: image_license
dtype: string
- name: text_license
dtype: string
- name: width
dtype: int64
- name: inpaint_details
struct:
- name: duration
dtype: int64
- name: guidance_scale
dtype: float64
- name: num_inference_steps
dtype: int64
- name: prompt
dtype: string
- name: prompts_used
dtype: int64
- name: quality
dtype: string
- name: mask_details
struct:
- name: height_factor
dtype: int64
- name: prompt
dtype: string
- name: prompts_used
dtype: int64
- name: width_factor
dtype: int64
splits:
- name: test
num_bytes: 1062104623.5
num_examples: 1260
download_size: 1055968442
dataset_size: 1062104623.5
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# InpaintCOCO - Fine-grained multimodal concept understanding
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Source Data](#source-data)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Each data sample contains two images and two corresponding captions that differ in only one respect: an object, the color of an object, or the size of an object.
> Many multimodal tasks, such as Vision-Language Retrieval and Visual Question Answering, present results in terms of overall performance.
> Unfortunately, this approach overlooks more nuanced concepts, leaving us unaware of which specific concepts contribute to the success of current models and which are ignored.
> In response to this limitation, more recent benchmarks attempt to assess particular aspects of vision-language models.
> Some existing datasets focus on linguistic concepts utilizing one image paired with multiple captions; others adopt a visual or cross-modal perspective.
> In this study, we are particularly interested in fine-grained visual concept understanding, which we believe is not covered in existing benchmarks in sufficient isolation.
> Therefore, we create the InpaintCOCO dataset which consists of image pairs with minimum differences that lead to changes in the captions.
Download the dataset:
```python
from datasets import load_dataset
dataset = load_dataset("phiyodr/InpaintCOCO")
```
### Supported Tasks and Leaderboards
InpaintCOCO is a benchmark to understand fine-grained concepts in multimodal models (vision-language) similar to [Winoground](https://huggingface.co/datasets/facebook/winoground).
To our knowledge, InpaintCOCO is the first benchmark that consists of image pairs with minimal differences, so that the *visual* representation can be analyzed in a more standardized setting.
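As in Winoground-style evaluations, a natural metric is pairwise matching accuracy: a model is credited for a sample only when it assigns each image the higher similarity with its own caption. A minimal sketch over hypothetical precomputed similarity scores (the `sim[i][j]` layout is an assumption for illustration, not part of the dataset):

```python
def pairwise_accuracy(samples):
    """Fraction of samples where both the COCO and the inpainted image
    are matched to their own caption. Each sample is a 2x2 matrix:
    sim[i][j] = similarity of image i with caption j (hypothetical)."""
    correct = 0
    for sim in samples:
        # image 0 must prefer caption 0, image 1 must prefer caption 1
        if sim[0][0] > sim[0][1] and sim[1][1] > sim[1][0]:
            correct += 1
    return correct / len(samples)

# toy similarity matrices (rows: images, cols: captions)
scores = [
    [[0.9, 0.2], [0.1, 0.8]],  # both images matched correctly
    [[0.3, 0.7], [0.4, 0.6]],  # first image mismatched
]
print(pairwise_accuracy(scores))  # 0.5
```

The similarity function itself (e.g. a CLIP-style image-text score) is left abstract here; any model that produces per-pair scores can be evaluated this way.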
### Languages
All texts are in English.
## Dataset Structure
```python
DatasetDict({
test: Dataset({
features: ['concept', 'coco_caption', 'coco_image',
'inpaint_caption', 'inpaint_image',
'mask', 'worker', 'coco_details', 'inpaint_details', 'mask_details'],
num_rows: 1260
})
})
```
### Data Instances
An example looks as follows:
```python
{'concept': 'object',
'coco_caption': 'A closeup of a large stop sign in the bushes.',
'coco_image': <PIL.PngImagePlugin.PngImageFile image mode=RGB size=512x512>,
'inpaint_caption': 'A wooden bench in the bushes.',
'inpaint_image': <PIL.PngImagePlugin.PngImageFile image mode=RGB size=512x512>,
'mask': <PIL.PngImagePlugin.PngImageFile image mode=RGB size=512x512>,
'worker': 'k',
'coco_details': {'captions': ['A stop sign is shown among foliage and grass.',
'A close up of a Stop sign near woods. ',
'A closeup of a large stop sign in the bushes.',
'A large oval Stop sign near some trees.',
'a close up of a stop sign with trees in the background'],
'coco_url': 'http://images.cocodataset.org/val2017/000000252332.jpg',
'date_captured': '2013-11-17 08:29:48',
'flickr_url': 'http://farm6.staticflickr.com/5261/5836914735_bef9249442_z.jpg',
'height': 480,
'id': 252332,
'image_license': 'https://creativecommons.org/licenses/by/2.0/',
'text_license': 'https://creativecommons.org/licenses/by/4.0/legalcode',
'width': 640},
'inpaint_details': {'duration': 18,
'guidance_scale': 7.5,
'num_inference_steps': 100,
'prompt': 'wooden bench',
'prompts_used': 2,
'quality': 'very good'},
'mask_details': {'height_factor': 25,
'prompt': 'stop sign',
'prompts_used': 1,
'width_factor': 25}}
```
## Dataset Creation
> The challenge set was created by undergraduate student workers. They were provided with an interactive Python environment with which they interacted via various prompts and inputs.
> The annotation proceeds as follows: The annotators are provided with an image and decide if the image is suitable for editing. If yes, they input the prompt for the object that should be replaced.
> Using the open vocabulary segmentation model [CLIPSeg](https://huggingface.co/CIDAS/clipseg-rd64-refined) ([Lüddecke and Ecker, 2022](https://openaccess.thecvf.com/content/CVPR2022/html/Luddecke_Image_Segmentation_Using_Text_and_Image_Prompts_CVPR_2022_paper.html)), we obtain a mask for the object of interest (e.g., "fire hydrant"). Then, the annotator inputs a prompt for [Stable Diffusion v2 Inpainting](https://huggingface.co/stabilityai/stable-diffusion-2-inpainting) ([Rombach et al., 2022](https://ommer-lab.com/research/latent-diffusion-models/)) (e.g., "yellow fire hydrant"), which produces three candidate images.
> The annotators can try new prompts or skip the current image if the result is insufficient. Finally, the annotator enters a new caption that matches the edited image.
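The recorded `mask_details` (a `prompt` plus `width_factor`/`height_factor`) suggest that the CLIPSeg mask is grown before inpainting. A minimal sketch of such a post-processing step, assuming the factors are percentages by which the masked bounding box is expanded (an illustration, not the authors' exact code):

```python
def binarize_and_expand(logits, width_factor=25, height_factor=25, thresh=0.5):
    """Threshold segmentation logits into a binary mask, then grow the
    masked bounding box by the given percentage of its width/height.
    (The factors' exact semantics are an assumption for this sketch.)"""
    h, w = len(logits), len(logits[0])
    pts = [(y, x) for y in range(h) for x in range(w) if logits[y][x] > thresh]
    if not pts:
        return [[False] * w for _ in range(h)]
    y0, y1 = min(p[0] for p in pts), max(p[0] for p in pts)
    x0, x1 = min(p[1] for p in pts), max(p[1] for p in pts)
    dy = (y1 - y0 + 1) * height_factor // 100
    dx = (x1 - x0 + 1) * width_factor // 100
    # mark the expanded box, clamped to the image bounds
    return [[max(0, y0 - dy) <= y <= y1 + dy and max(0, x0 - dx) <= x <= x1 + dx
             for x in range(w)] for y in range(h)]

# toy 8x8 logit map with a 2x2 "object" in the middle
logits = [[0.0] * 8 for _ in range(8)]
for y in range(3, 5):
    for x in range(3, 5):
        logits[y][x] = 1.0
expanded = binarize_and_expand(logits, width_factor=50, height_factor=50)
print(sum(v for row in expanded for v in row))  # 16 (the 2x2 box grown to 4x4)
```

Growing the mask gives the inpainting model room to blend the new object into its surroundings, which is a common trick with mask-based diffusion editing.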
### Source Data
InpaintCOCO is based on the MS COCO 2017 validation set ([image](http://images.cocodataset.org/zips/val2017.zip), [annotations](http://images.cocodataset.org/annotations/annotations_trainval2014.zip)).
```
@misc{lin2015microsoft,
title={Microsoft COCO: Common Objects in Context},
author={Tsung-Yi Lin and Michael Maire and Serge Belongie and Lubomir Bourdev and Ross Girshick and James Hays and Pietro Perona and Deva Ramanan and C. Lawrence Zitnick and Piotr Dollár},
year={2015},
eprint={1405.0312},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
## Limitations
> The images in the COCO dataset come from Flickr from 2014; therefore, they reflect the Flickr user structure at that time, i.e., the images mostly show the Western world and/or other countries from the Western perspective. The captions are in English. Thus, the model we developed does not generalize well beyond the Western world.
## Licensing Information
* Images come with individual licenses (`image_license`) based on their Flickr source. The possible licenses are
* [CC BY-NC-SA 2.0 Deed](https://creativecommons.org/licenses/by-nc-sa/2.0/),
* [CC BY-NC 2.0 Deed](https://creativecommons.org/licenses/by-nc/2.0/),
* [CC BY 2.0 Deed](https://creativecommons.org/licenses/by/2.0/), and
* [CC BY-SA 2.0 Deed](https://creativecommons.org/licenses/by-sa/2.0/).
* The remaining work comes with the [CC BY 4.0 Legal Code](https://creativecommons.org/licenses/by/4.0/legalcode) license.
## Citation Information
Our InpaintCOCO dataset:
```
@inproceedings{Roesch2022Enhancing,
title={Enhancing Conceptual Understanding in Multimodal Contrastive Learning through Hard Negative Samples},
url={},
author={Rösch, Philipp J. and Oswald, Norbert and Geierhos, Michaela and Libovický, Jindřich},
year={2023}
}
```
For the MS COCO dataset please see above. |
lansinuote/diffusion.7.control_net | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 453988831.0
num_examples: 50000
download_size: 0
dataset_size: 453988831.0
---
# Dataset Card for "diffusion.7.control_net"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kenhktsui/minipile_quality_score_v1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: quality_score_v1
dtype: float64
splits:
- name: validation
num_bytes: 2783386
num_examples: 500
- name: train
num_bytes: 5914108510
num_examples: 1000000
- name: test
num_bytes: 58638191
num_examples: 10000
download_size: 3183576298
dataset_size: 5975530087
language:
- en
task_categories:
- text-generation
---
# Dataset Card for "minipile_quality_score_v1"
Adding quality score v1 to [JeanKaddour/minipile](https://huggingface.co/datasets/JeanKaddour/minipile)
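A typical use of such a score column is threshold filtering before pretraining. A minimal sketch on toy rows with the same schema (the 0.5 cutoff is an arbitrary illustration, not a recommended value):

```python
# toy rows mirroring the dataset schema: text + quality_score_v1
rows = [
    {"text": "well-formed article ...", "quality_score_v1": 0.91},
    {"text": "boilerplate junk ...", "quality_score_v1": 0.12},
    {"text": "decent forum post ...", "quality_score_v1": 0.57},
]

def filter_by_quality(rows, threshold=0.5):
    """Keep only rows whose quality_score_v1 meets the threshold."""
    return [r for r in rows if r["quality_score_v1"] >= threshold]

kept = filter_by_quality(rows)
print(len(kept))  # 2
```

With 🤗 Datasets the equivalent is `dataset.filter(lambda r: r["quality_score_v1"] >= 0.5)`.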
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tanvirsrbd1/nov1_annotated_segmented | ---
dataset_info:
features:
- name: html
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1948259
num_examples: 3107
download_size: 643703
dataset_size: 1948259
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "nov1_annotated_segmented"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sxfly/AesData | ---
license: apache-2.0
---
|
ibranze/araproje_hellaswag_tr_conf_gpt_bestscore_reversed | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 87090
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_conf_gpt_bestscore_reversed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vinnyyw/Dulcemoney | ---
license: openrail
---
|
open-llm-leaderboard/details_ABX-AI__Silver-Sun-v2-11B | ---
pretty_name: Evaluation run of ABX-AI/Silver-Sun-v2-11B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ABX-AI/Silver-Sun-v2-11B](https://huggingface.co/ABX-AI/Silver-Sun-v2-11B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ABX-AI__Silver-Sun-v2-11B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T11:22:55.196822](https://huggingface.co/datasets/open-llm-leaderboard/details_ABX-AI__Silver-Sun-v2-11B/blob/main/results_2024-04-09T11-22-55.196822.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6597078611621988,\n\
\ \"acc_stderr\": 0.03122053160779182,\n \"acc_norm\": 0.6714931595759738,\n\
\ \"acc_norm_stderr\": 0.032044191906105295,\n \"mc1\": 0.4773561811505508,\n\
\ \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6249066782001546,\n\
\ \"mc2_stderr\": 0.015682564306191936\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6706484641638225,\n \"acc_stderr\": 0.013734057652635474,\n\
\ \"acc_norm\": 0.6988054607508533,\n \"acc_norm_stderr\": 0.013406741767847638\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6938856801433977,\n\
\ \"acc_stderr\": 0.00459935892090954,\n \"acc_norm\": 0.8781119298944433,\n\
\ \"acc_norm_stderr\": 0.0032648787375868862\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736413,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736413\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48677248677248675,\n \"acc_stderr\": 0.025742297289575142,\n \"\
acc_norm\": 0.48677248677248675,\n \"acc_norm_stderr\": 0.025742297289575142\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.832258064516129,\n\
\ \"acc_stderr\": 0.021255464065371325,\n \"acc_norm\": 0.832258064516129,\n\
\ \"acc_norm_stderr\": 0.021255464065371325\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656208,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656208\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.02325315795194208,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02325315795194208\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8725490196078431,\n \"acc_stderr\": 0.023405530480846322,\n \"\
acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.023405530480846322\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878467,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878467\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.03957835471980981,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.03957835471980981\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n\
\ \"acc_stderr\": 0.01424887354921757,\n \"acc_norm\": 0.8020434227330779,\n\
\ \"acc_norm_stderr\": 0.01424887354921757\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4994413407821229,\n\
\ \"acc_stderr\": 0.016722491114073344,\n \"acc_norm\": 0.4994413407821229,\n\
\ \"acc_norm_stderr\": 0.016722491114073344\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.024051029739912258,\n\
\ \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.024051029739912258\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n\
\ \"acc_stderr\": 0.02521804037341062,\n \"acc_norm\": 0.729903536977492,\n\
\ \"acc_norm_stderr\": 0.02521804037341062\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7746913580246914,\n \"acc_stderr\": 0.02324620264781975,\n\
\ \"acc_norm\": 0.7746913580246914,\n \"acc_norm_stderr\": 0.02324620264781975\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.524822695035461,\n \"acc_stderr\": 0.029790719243829727,\n \
\ \"acc_norm\": 0.524822695035461,\n \"acc_norm_stderr\": 0.029790719243829727\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49934810951760106,\n\
\ \"acc_stderr\": 0.012770225252255555,\n \"acc_norm\": 0.49934810951760106,\n\
\ \"acc_norm_stderr\": 0.012770225252255555\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7573529411764706,\n \"acc_stderr\": 0.02604066247420124,\n\
\ \"acc_norm\": 0.7573529411764706,\n \"acc_norm_stderr\": 0.02604066247420124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.01879808628488688,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.01879808628488688\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960227,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960227\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.02411267824090081,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.02411267824090081\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4773561811505508,\n\
\ \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6249066782001546,\n\
\ \"mc2_stderr\": 0.015682564306191936\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \
\ \"acc_stderr\": 0.0016927007401501789\n }\n}\n```"
repo_url: https://huggingface.co/ABX-AI/Silver-Sun-v2-11B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|arc:challenge|25_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|gsm8k|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hellaswag|10_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-22-55.196822.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T11-22-55.196822.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- '**/details_harness|winogrande|5_2024-04-09T11-22-55.196822.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T11-22-55.196822.parquet'
- config_name: results
data_files:
- split: 2024_04_09T11_22_55.196822
path:
- results_2024-04-09T11-22-55.196822.parquet
- split: latest
path:
- results_2024-04-09T11-22-55.196822.parquet
---
# Dataset Card for Evaluation run of ABX-AI/Silver-Sun-v2-11B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ABX-AI/Silver-Sun-v2-11B](https://huggingface.co/ABX-AI/Silver-Sun-v2-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ABX-AI__Silver-Sun-v2-11B",
"harness_winogrande_5",
	split="latest")
```
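The per-task config names listed above follow a fixed pattern: `harness_`, the task name with `-` and `:` replaced by `_`, then the few-shot count. A small illustrative helper (an assumption for convenience, not part of the official tooling) to derive them:

```python
def harness_config_name(task: str, n_shot: int) -> str:
    """Build a config name matching the ones listed in this card,
    e.g. harness_hendrycksTest_abstract_algebra_5."""
    # '-' and ':' in task names (e.g. "truthfulqa:mc") become '_'.
    slug = task.replace("-", "_").replace(":", "_")
    return f"harness_{slug}_{n_shot}"

print(harness_config_name("hendrycksTest-abstract_algebra", 5))
# harness_hendrycksTest_abstract_algebra_5
print(harness_config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

Any of these names can be passed as the second argument to `load_dataset` in the example above.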
## Latest results
These are the [latest results from run 2024-04-09T11:22:55.196822](https://huggingface.co/datasets/open-llm-leaderboard/details_ABX-AI__Silver-Sun-v2-11B/blob/main/results_2024-04-09T11-22-55.196822.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the `results` config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6597078611621988,
"acc_stderr": 0.03122053160779182,
"acc_norm": 0.6714931595759738,
"acc_norm_stderr": 0.032044191906105295,
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6249066782001546,
"mc2_stderr": 0.015682564306191936
},
"harness|arc:challenge|25": {
"acc": 0.6706484641638225,
"acc_stderr": 0.013734057652635474,
"acc_norm": 0.6988054607508533,
"acc_norm_stderr": 0.013406741767847638
},
"harness|hellaswag|10": {
"acc": 0.6938856801433977,
"acc_stderr": 0.00459935892090954,
"acc_norm": 0.8781119298944433,
"acc_norm_stderr": 0.0032648787375868862
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736413,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736413
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48677248677248675,
"acc_stderr": 0.025742297289575142,
"acc_norm": 0.48677248677248675,
"acc_norm_stderr": 0.025742297289575142
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.832258064516129,
"acc_stderr": 0.021255464065371325,
"acc_norm": 0.832258064516129,
"acc_norm_stderr": 0.021255464065371325
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656208,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656208
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.02325315795194208,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.02325315795194208
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8725490196078431,
"acc_stderr": 0.023405530480846322,
"acc_norm": 0.8725490196078431,
"acc_norm_stderr": 0.023405530480846322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878467,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878467
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980981,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980981
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.01424887354921757,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.01424887354921757
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4994413407821229,
"acc_stderr": 0.016722491114073344,
"acc_norm": 0.4994413407821229,
"acc_norm_stderr": 0.016722491114073344
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341062,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341062
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7746913580246914,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.7746913580246914,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.524822695035461,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.524822695035461,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49934810951760106,
"acc_stderr": 0.012770225252255555,
"acc_norm": 0.49934810951760106,
"acc_norm_stderr": 0.012770225252255555
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7573529411764706,
"acc_stderr": 0.02604066247420124,
"acc_norm": 0.7573529411764706,
"acc_norm_stderr": 0.02604066247420124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.01879808628488688,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.01879808628488688
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960227,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960227
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.02411267824090081,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.02411267824090081
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6249066782001546,
"mc2_stderr": 0.015682564306191936
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828075
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401501789
}
}
```
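The per-task scores above can be aggregated with a few lines of Python. A minimal sketch (assuming the results JSON shown above has been loaded into a `results` dict; `mmlu_average` is a hypothetical helper name, not part of the harness):

```python
from statistics import mean

# Hypothetical helper: average the "acc" values of all MMLU sub-tasks.
# Each hendrycksTest entry in the harness results exposes an "acc" field;
# averaging them approximates the aggregate MMLU score.
def mmlu_average(results: dict) -> float:
    accs = [
        task["acc"]
        for name, task in results.items()
        if name.startswith("harness|hendrycksTest-")
    ]
    return mean(accs)

# To load from disk instead, one could use:
# import json
# results = json.load(open("results.json"))
```

This only covers the MMLU sub-tasks; other entries such as `harness|gsm8k|5` report their own metrics and are skipped by the prefix filter.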
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jamesjazz/perturb_seq | ---
license: mit
---
|
k0ntra/Aladdin | ---
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
splits:
- name: train
num_bytes: 116736
num_examples: 76
download_size: 0
dataset_size: 116736
---
# Dataset Card for "Aladdin"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_5_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4398480
num_examples: 11937
download_size: 1933043
dataset_size: 4398480
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_5_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
isha-techies/call-center-qa-squad-2 | ---
task_categories:
- question-answering
language:
- en
--- |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/46bc615b | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1337
dataset_size: 184
---
# Dataset Card for "46bc615b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nc33/entailment | ---
license: mit
---
|
saraimarte/flowerVase | ---
license: other
---
|
CyberHarem/jean_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of jean/ジン/琴 (Genshin Impact)
This is the dataset of jean/ジン/琴 (Genshin Impact), containing 500 images and their tags.
The core tags of this character are `blonde_hair, ponytail, long_hair, blue_eyes, breasts, sidelocks, hair_between_eyes, bow, hair_bow, large_breasts, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1009.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jean_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200             | 500      | 834.70 MiB  | [Download](https://huggingface.co/datasets/CyberHarem/jean_genshin/resolve/main/dataset-1200.zip)             | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 1342 | 1.65 GiB | [Download](https://huggingface.co/datasets/CyberHarem/jean_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/jean_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; specific outfits may be discoverable in these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, holding_sword, solo, white_pants, strapless, detached_sleeves, cleavage, blue_capelet, looking_at_viewer, belt, corset, tight_clothes, tight_pants, black_gloves, closed_mouth, gauntlets, detached_collar |
| 1 | 6 |  |  |  |  |  | 1girl, blue_capelet, cleavage, cowboy_shot, gloves, solo, standing, strapless, white_pants, belt, blush, detached_sleeves, tight_clothes, corset, dandelion, detached_collar, holding_flower, looking_at_viewer, simple_background, tight_pants, white_background, bare_shoulders, black_bow, gauntlets, hand_up, parted_lips |
| 2 | 15 |  |  |  |  |  | 1girl, blue_capelet, cleavage, looking_at_viewer, solo, white_pants, corset, tight_clothes, detached_sleeves, strapless, tight_pants, blush, detached_collar, belt, bare_shoulders, black_gloves, closed_mouth, gauntlets, smile, simple_background, sitting, white_background |
| 3 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, upper_body, cleavage, detached_collar, detached_sleeves, simple_background, bare_shoulders, closed_mouth, white_background, armpits, corset, capelet, strapless_shirt |
| 4 | 5 |  |  |  |  |  | 1girl, blue_sky, cleavage, cloud, detached_collar, detached_sleeves, strapless, upper_body, blue_capelet, dandelion, day, outdoors, solo, black_gloves, black_bow, closed_mouth, corset |
| 5 | 6 |  |  |  |  |  | 1girl, bare_shoulders, blue_shirt, closed_mouth, detached_sleeves, looking_at_viewer, official_alternate_costume, smile, solo, upper_body, blush, black_bow |
| 6 | 5 |  |  |  |  |  | 1girl, bare_shoulders, blue_rose, blue_sky, cloud, day, detached_sleeves, looking_at_viewer, official_alternate_costume, solo, beach, belt, high-waist_shorts, ocean, outdoors, white_shorts, blue_shirt, blush, open_mouth, thighs, water, :d, shore, standing, thigh_strap |
| 7 | 7 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, cleavage, collarbone, cowboy_shot, midriff, navel, stomach, armpits, bare_arms, closed_mouth, yoga_pants, arms_up, blush, standing, alternate_costume, arm_up, arms_behind_head, black_bow, black_pants, earrings, simple_background, sportswear, sweat, tight_pants, white_sports_bra |
| 8 | 7 |  |  |  |  |  | 1girl, solo, alternate_costume, white_shirt, cross_earrings, looking_at_viewer, belt, black_bow, black_pants, blush, cowboy_shot, long_sleeves, simple_background, white_background, bracelet, closed_mouth, collared_shirt, jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | holding_sword | solo | white_pants | strapless | detached_sleeves | cleavage | blue_capelet | looking_at_viewer | belt | corset | tight_clothes | tight_pants | black_gloves | closed_mouth | gauntlets | detached_collar | cowboy_shot | gloves | standing | blush | dandelion | holding_flower | simple_background | white_background | bare_shoulders | black_bow | hand_up | parted_lips | smile | sitting | upper_body | armpits | capelet | strapless_shirt | blue_sky | cloud | day | outdoors | blue_shirt | official_alternate_costume | blue_rose | beach | high-waist_shorts | ocean | white_shorts | open_mouth | thighs | water | :d | shore | thigh_strap | collarbone | midriff | navel | stomach | bare_arms | yoga_pants | arms_up | alternate_costume | arm_up | arms_behind_head | black_pants | earrings | sportswear | sweat | white_sports_bra | white_shirt | cross_earrings | long_sleeves | bracelet | collared_shirt | jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:-------|:--------------|:------------|:-------------------|:-----------|:---------------|:--------------------|:-------|:---------|:----------------|:--------------|:---------------|:---------------|:------------|:------------------|:--------------|:---------|:-----------|:--------|:------------|:-----------------|:--------------------|:-------------------|:-----------------|:------------|:----------|:--------------|:--------|:----------|:-------------|:----------|:----------|:------------------|:-----------|:--------|:------|:-----------|:-------------|:-----------------------------|:------------|:--------|:--------------------|:--------|:---------------|:-------------|:---------|:--------|:-----|:--------|:--------------|:-------------|:----------|:--------|:----------|:------------|:-------------|:----------|:--------------------|:---------|:-------------------|:--------------|:-----------|:-------------|:--------|:-------------------|:--------------|:-----------------|:---------------|:-----------|:-----------------|:---------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | X | X | X | X | X | X | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | X | | | X | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | | X | | | X | X | | X | | X | | | | X | | X | | | | | | | X | X | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | | X | X | X | X | | | X | | | X | X | | X | | | | | X | | | | | X | | | | | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | X | | | X | | | X | | | | | | X | | | | | | X | | | | | X | X | | | X | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | X | | | X | | | X | X | | | | | | | | | | X | X | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | | X | | | | X | | X | | | | X | | X | | | X | | X | X | | | X | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 8 | 7 |  |  |  |  |  | X | | X | | | | | | X | X | | | | | X | | | X | | | X | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | X | X | X | X | X | X |
|
Lancelot53/srbd1_segmented | ---
dataset_info:
features:
- name: html
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1446076
num_examples: 1496
download_size: 0
dataset_size: 1446076
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "srbd1_segmented"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dumyy/Summary_CC | ---
license: apache-2.0
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 2064638
num_examples: 517
- name: test
num_bytes: 2157246
num_examples: 500
download_size: 459521
dataset_size: 4221884
---
|
liuyanchen1015/MULTI_VALUE_mrpc_superlative_before_matrix_head | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: train
num_bytes: 244
num_examples: 1
download_size: 3382
dataset_size: 244
---
# Dataset Card for "MULTI_VALUE_mrpc_superlative_before_matrix_head"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FanChen0116/19100_chat_128x_slot_empty | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-time
'2': B-date
'3': B-last_name
'4': B-people
'5': I-date
'6': I-people
'7': I-last_name
'8': I-first_name
'9': B-first_name
'10': B-time
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 1366291
num_examples: 8192
- name: validation
num_bytes: 4861
num_examples: 32
- name: test
num_bytes: 646729
num_examples: 3731
download_size: 0
dataset_size: 2017881
---
# Dataset Card for "19100_chat_128x_slot_empty"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aisc-team-d2/liveqa | ---
dataset_info:
features:
- name: NIST_PARAPHRASE
dtype: string
- name: NLM_SUMMARY
dtype: string
- name: REFERENCE_ANSWERS
list:
- name: ANSWER
dtype: string
- name: AnswerURL
dtype: string
- name: COMMENT
dtype: string
- name: _aid
dtype: string
- name: QUESTION_ID
dtype: string
- name: ORIGINAL_QUESTION_SUBJECT
dtype: string
- name: ORIGINAL_QUESTION_MESSAGE
dtype: string
- name: ORIGINAL_QUESTION_FILE
dtype: string
- name: ANNOTATIONS_FOCUS
list:
- name: _fid
dtype: string
- name: _fcategory
dtype: string
- name: __text
dtype: string
- name: ANNOTATIONS_TYPE
list:
- name: _tid
dtype: string
- name: _hasFocus
dtype: string
- name: __text
dtype: string
- name: _hasKeyword
dtype: string
- name: ANNOTATIONS_KEYWORD
list:
- name: _kid
dtype: string
- name: _kcategory
dtype: string
- name: __text
dtype: string
splits:
- name: test
num_bytes: 212327
num_examples: 104
download_size: 139923
dataset_size: 212327
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
shrinivas1510/open_Orca_preprocessed | ---
license: mit
---
|
voidful/NMSQA-CODE | ---
language: en
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: audio_full_answer_end
sequence: float64
- name: audio_full_answer_start
sequence: float64
- name: audio_segment_answer_end
sequence: float64
- name: audio_segment_answer_start
sequence: float64
- name: text
sequence: string
- name: content_segment_audio_path
dtype: string
- name: content_full_audio_path
dtype: string
- name: content_audio_sampling_rate
dtype: float64
- name: content_audio_speaker
dtype: string
- name: content_segment_text
dtype: string
- name: content_segment_normalized_text
dtype: string
- name: question_audio_path
dtype: string
- name: question_audio_sampling_rate
dtype: float64
- name: question_audio_speaker
dtype: string
- name: question_normalized_text
dtype: string
- name: hubert_100_context_unit
dtype: string
- name: hubert_100_question_unit
dtype: string
- name: hubert_100_answer_unit
dtype: string
- name: mhubert_1000_context_unit
dtype: string
- name: mhubert_1000_question_unit
dtype: string
- name: mhubert_1000_answer_unit
dtype: string
splits:
- name: train
num_bytes: 3329037982
num_examples: 87599
- name: test
num_bytes: 1079782
num_examples: 171
- name: dev
num_bytes: 411186265
num_examples: 10570
download_size: 507994561
dataset_size: 3741304029
---
# Dataset Card for "NMSQA-CODE"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Aditya685/filtered_data_80k | ---
language:
- en
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 290184604
num_examples: 80000
download_size: 120878335
dataset_size: 290184604
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yenping/training-data | ---
license: apache-2.0
language:
- zh
--- |
tyzhu/find_sent_after_sent_train_400_eval_40_no_permute | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 5866475.834053587
num_examples: 4188
- name: validation
num_bytes: 232483
num_examples: 200
download_size: 1126325
dataset_size: 6098958.834053587
---
# Dataset Card for "find_sent_after_sent_train_400_eval_40_no_permute"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cp500/tokenized_medical_NER | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 25177853.289020833
num_examples: 8943
- name: test
num_bytes: 2798477.710979169
num_examples: 994
download_size: 4069580
dataset_size: 27976331.0
---
# Dataset Card for "tokenized_medical_NER"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Yijia-Xiao/pii-wikidoc | ---
dataset_info:
features:
- name: output
dtype: string
- name: input
dtype: string
- name: instruction
dtype: string
- name: cleaned_output
dtype: string
splits:
- name: train
num_bytes: 19486545
num_examples: 10000
download_size: 10662804
dataset_size: 19486545
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "pii-wikidoc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/oasst_top1_standardized_cluster_0_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 5241595
num_examples: 3029
download_size: 3098315
dataset_size: 5241595
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oasst_top1_standardized_cluster_0_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nadav/Clemt | ---
dataset_info:
features:
- name: text
dtype: string
- name: file
dtype: string
splits:
- name: train
num_bytes: 180028095
num_examples: 300
- name: test
num_bytes: 18814514
num_examples: 33
download_size: 117182541
dataset_size: 198842609
---
# Dataset Card for "Clemt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
relaxasian/piccolo | ---
license: openrail
---
|
alisson40889/pablo | ---
license: openrail
---
|
thaottn/DataComp_large_pool_BLIP2_captions | ---
license: cc-by-4.0
task_categories:
- image-to-text
- zero-shot-classification
size_categories:
- 1B<n<10B
---
# Dataset Card for DataComp_large_pool_BLIP2_captions
## Dataset Description
- **Paper: https://arxiv.org/abs/2307.10350**
- **Leaderboard: https://www.datacomp.ai/leaderboard.html**
- **Point of Contact: Thao Nguyen (thaottn@cs.washington.edu)**
### Dataset Summary
### Supported Tasks and Leaderboards
We have used this dataset for pre-training CLIP models and found that it rivals or outperforms models trained on raw web captions on average across the 38 evaluation tasks proposed by DataComp.
Refer to the DataComp leaderboard (https://www.datacomp.ai/leaderboard.html) for the top baselines uncovered in our work.
### Languages
Primarily English.
## Dataset Structure
### Data Instances
Each instance maps a unique image identifier from DataComp to the corresponding BLIP2 caption generated with temperature 0.75.
### Data Fields
- `uid`: SHA-256 hash of the image, provided as metadata by the DataComp team.
- `blip2-cap`: the corresponding caption generated by BLIP2.
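A uid can be recomputed locally to join this dataset's captions against images you already hold. A minimal sketch, assuming the uid is the lowercase hex SHA-256 digest of the raw image bytes (the exact bytes DataComp hashed are an assumption here); `caption_for` and `uid_to_caption` are hypothetical helpers for illustration:

```python
import hashlib

def image_uid(image_bytes: bytes) -> str:
    # Lowercase hex SHA-256 digest of the raw image bytes (assumed uid scheme).
    return hashlib.sha256(image_bytes).hexdigest()

def caption_for(path: str, uid_to_caption: dict):
    # Look up the BLIP2 caption for a local image file, if its uid is present.
    with open(path, "rb") as f:
        return uid_to_caption.get(image_uid(f.read()))
```

With a dict built from this dataset's rows (`{row["uid"]: row["blip2-cap"]}`), `caption_for` returns the synthetic caption for any image whose uid appears in the pool, and `None` otherwise.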
### Data Splits
Data was not split. The dataset is intended for pre-training multimodal models.
## Dataset Creation
### Curation Rationale
Web-crawled image-text data can contain a lot of noise, i.e. the caption may not reflect the content of the respective image. Filtering out noisy web data, however, can hurt the diversity of the training set.
To address both of these issues, we use image captioning models to increase the number of useful training samples from the initial pool, by ensuring the captions are more relevant to the images.
Our work systematically explores the effectiveness of using these synthetic captions to replace or complement the raw text data, in the context of CLIP pre-training.
### Source Data
#### Initial Data Collection and Normalization
The original 1.28B image-text pairs were collected by the DataComp team from Common Crawl. Minimal filtering was performed on the initial data pool (face blurring, NSFW removal, train-test deduplication).
We then replaced the original web-crawled captions with synthetic captions generated by BLIP2.
#### Who are the source language producers?
Common Crawl is the source for images. BLIP2 is the source of the text data.
### Annotations
#### Annotation process
The dataset was built in a fully automated process: captions are generated by the BLIP2 captioning model.
#### Who are the annotators?
No human annotators are involved.
### Personal and Sensitive Information
The images, which we inherit from the DataComp benchmark, already underwent face detection and face blurring. While the DataComp team made an attempt to remove NSFW instances, it is possible that such content may still exist (to a small degree) in this dataset.
Due to the large-scale nature of this dataset, the content has not been manually verified to be completely safe. Therefore, it is strongly recommended that this dataset be used only for research purposes.
## Considerations for Using the Data
### Social Impact of Dataset
The publication contains some preliminary analyses of the fairness implications of training on this dataset, when evaluating on FairFace.
### Discussion of Biases
Refer to the publication for more details.
### Other Known Limitations
Refer to the publication for more details.
## Additional Information
### Citation Information
```bibtex
@article{nguyen2023improving,
title={Improving Multimodal Datasets with Image Captioning},
author={Nguyen, Thao and Gadre, Samir Yitzhak and Ilharco, Gabriel and Oh, Sewoong and Schmidt, Ludwig},
journal={arXiv preprint arXiv:2307.10350},
year={2023}
}
``` |
patruff/chucklesF1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 180104
num_examples: 1183
- name: test
num_bytes: 45047
num_examples: 296
download_size: 47299
dataset_size: 225151
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
dcassine/trevor | ---
license: unknown
---
|
CyberHarem/mash_kyrielight_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mash_kyrielight/マシュ・キリエライト/玛修·基列莱特 (Fate/Grand Order)
This is the dataset of mash_kyrielight/マシュ・キリエライト/玛修·基列莱特 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `short_hair, purple_eyes, hair_over_one_eye, breasts, pink_hair, large_breasts, glasses, medium_breasts, purple_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 722.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mash_kyrielight_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 632.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mash_kyrielight_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1262 | 1.21 GiB | [Download](https://huggingface.co/datasets/CyberHarem/mash_kyrielight_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mash_kyrielight_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
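The IMG+TXT packages need no waifuc loader: each image is paired with a sibling `.txt` tag file. A minimal sketch of collecting those tags after extracting one of the zips, assuming the usual comma-separated tag-file convention (the exact file layout inside the archives is an assumption):

```python
import os

def read_tag_files(dataset_dir):
    # Map each image stem to its tag list, read from sibling .txt files.
    tags = {}
    for name in os.listdir(dataset_dir):
        if name.endswith(".txt"):
            stem, _ = os.path.splitext(name)
            with open(os.path.join(dataset_dir, name), encoding="utf-8") as f:
                tags[stem] = [t.strip() for t in f.read().split(",") if t.strip()]
    return tags
```

The resulting dict (e.g. `{"0001": ["1girl", "solo", ...]}`) can be fed directly into most LoRA training pipelines that expect image/caption pairs.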
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, elbow_gloves, looking_at_viewer, solo, thighhighs, armored_dress, bare_shoulders, lord_camelot_(fate), thigh_strap, black_leotard, purple_gloves, closed_mouth, holding_shield, standing, armored_boots, ass, smile |
| 1 | 5 |  |  |  |  |  | 1girl, armored_dress, bare_shoulders, blush, cloud, elbow_gloves, looking_at_viewer, outdoors, solo, black_thighhighs, lord_camelot_(fate), navel_cutout, shield, smile, closed_mouth, day, dutch_angle, petals, thigh_strap, black_gloves, blue_sky, cowboy_shot, eyes_visible_through_hair, holding, leotard, purple_gloves, standing, water |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_leotard, cleavage_cutout, elbow_gloves, looking_at_viewer, solo, white_background, highleg_leotard, navel_cutout, open_mouth, simple_background, thigh_strap, thighs, black_thighhighs, blush, purple_gloves, shield, black_gloves, boots, eyes_visible_through_hair, groin, holding, lord_camelot_(fate), smile, standing |
| 3 | 5 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, collarbone, looking_at_viewer, solo, blush, navel, white_bikini, open_mouth, simple_background, white_background, front-tie_top, official_alternate_costume, outdoors |
| 4 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, bare_shoulders, cleavage, collarbone, official_alternate_costume, striped_bikini, striped_clothes, blush, multicolored_bikini, navel, see-through, outdoors, side-tie_bikini_bottom, thighs, blue_sky, cloud, open_mouth, day, cowboy_shot, ocean |
| 5 | 6 |  |  |  |  |  | 1girl, blush, cleavage, dress_swimsuit, looking_at_viewer, official_alternate_costume, solo, white_dress, bare_shoulders, collarbone, :d, open_mouth, pink_bow, black-framed_eyewear, dress_bow, long_sleeves, open_jacket, white_background |
| 6 | 9 |  |  |  |  |  | 1girl, bare_shoulders, hair_flower, looking_at_viewer, official_alternate_costume, solo, white_dress, white_flower, off-shoulder_dress, smile, white_gloves, blush, cleavage, closed_mouth, collarbone, ribbon_choker, eyes_visible_through_hair, from_side, upper_body |
| 7 | 36 |  |  |  |  |  | 1girl, looking_at_viewer, red_necktie, solo, black_dress, open_jacket, blush, long_sleeves, smile, black-framed_eyewear, closed_mouth, black_pantyhose, hood, white_background, simple_background, grey_jacket, collared_dress |
| 8 | 7 |  |  |  |  |  | 1girl, alternate_costume, bare_shoulders, playboy_bunny, solo, wrist_cuffs, black_leotard, detached_collar, fake_animal_ears, looking_at_viewer, rabbit_ears, blush, black_thighhighs, open_mouth, rabbit_tail, strapless_leotard, black_pantyhose, cleavage, drinking_glass, eyes_visible_through_hair, from_behind, holding, smile, tray, white_background |
| 9 | 9 |  |  |  |  |  | 1girl, bare_shoulders, blush, looking_at_viewer, panties, solo, thighs, smile, bra, closed_mouth, collarbone, navel, alternate_costume, cleavage, lying, underwear_only, stomach |
| 10 | 8 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, alternate_costume, bare_shoulders, cleavage, navel, stomach, collarbone, simple_background, white_background, bow_bra, cowboy_shot, underwear_only, white_shirt, open_mouth, standing, ass_visible_through_thighs, bow_panties, eyes_visible_through_hair, long_sleeves, off_shoulder, open_shirt, plaid, purple_panties, undressing, white_bra |
| 11 | 10 |  |  |  |  |  | 1girl, maid_headdress, solo, black_dress, looking_at_viewer, waist_apron, enmaided, blush, frills, white_apron, bowtie, cleavage_cutout, closed_mouth, smile, white_thighhighs, black_gloves, cup, elbow_gloves, holding, puffy_short_sleeves, ribbon, sitting, wrist_cuffs |
| 12 | 13 |  |  |  |  |  | 1girl, floral_print, long_sleeves, looking_at_viewer, solo, alternate_costume, blush, wide_sleeves, hair_flower, holding, obi, :d, blurry, open_mouth, closed_mouth, outdoors, yukata, night |
| 13 | 7 |  |  |  |  |  | 1girl, bare_shoulders, blush, elbow_gloves, fur_collar, halloween_costume, looking_at_viewer, official_alternate_costume, revealing_clothes, solo, wolf_ears, open_mouth, purple_tail, wolf_tail, cleavage, fang, fur-trimmed_gloves, fur-trimmed_legwear, purple_gloves, purple_thighhighs, navel, simple_background, white_background, :d, claw_pose, eyes_visible_through_hair, o-ring_top, pink_bow |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | elbow_gloves | looking_at_viewer | solo | thighhighs | armored_dress | bare_shoulders | lord_camelot_(fate) | thigh_strap | black_leotard | purple_gloves | closed_mouth | holding_shield | standing | armored_boots | ass | smile | blush | cloud | outdoors | black_thighhighs | navel_cutout | shield | day | dutch_angle | petals | black_gloves | blue_sky | cowboy_shot | eyes_visible_through_hair | holding | leotard | water | cleavage_cutout | white_background | highleg_leotard | open_mouth | simple_background | thighs | boots | groin | cleavage | collarbone | navel | white_bikini | front-tie_top | official_alternate_costume | striped_bikini | striped_clothes | multicolored_bikini | see-through | side-tie_bikini_bottom | ocean | dress_swimsuit | white_dress | :d | pink_bow | black-framed_eyewear | dress_bow | long_sleeves | open_jacket | hair_flower | white_flower | off-shoulder_dress | white_gloves | ribbon_choker | from_side | upper_body | red_necktie | black_dress | black_pantyhose | hood | grey_jacket | collared_dress | alternate_costume | playboy_bunny | wrist_cuffs | detached_collar | fake_animal_ears | rabbit_ears | rabbit_tail | strapless_leotard | drinking_glass | from_behind | tray | panties | bra | lying | underwear_only | stomach | bow_bra | white_shirt | ass_visible_through_thighs | bow_panties | off_shoulder | open_shirt | plaid | purple_panties | undressing | white_bra | maid_headdress | waist_apron | enmaided | frills | white_apron | bowtie | white_thighhighs | cup | puffy_short_sleeves | ribbon | sitting | floral_print | wide_sleeves | obi | blurry | yukata | night | fur_collar | halloween_costume | revealing_clothes | wolf_ears | purple_tail | wolf_tail | fang | fur-trimmed_gloves | fur-trimmed_legwear | purple_thighhighs | claw_pose | o-ring_top |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------|:--------------------|:-------|:-------------|:----------------|:-----------------|:----------------------|:--------------|:----------------|:----------------|:---------------|:-----------------|:-----------|:----------------|:------|:--------|:--------|:--------|:-----------|:-------------------|:---------------|:---------|:------|:--------------|:---------|:---------------|:-----------|:--------------|:----------------------------|:----------|:----------|:--------|:------------------|:-------------------|:------------------|:-------------|:--------------------|:---------|:--------|:--------|:-----------|:-------------|:--------|:---------------|:----------------|:-----------------------------|:-----------------|:------------------|:----------------------|:--------------|:-------------------------|:--------|:-----------------|:--------------|:-----|:-----------|:-----------------------|:------------|:---------------|:--------------|:--------------|:---------------|:---------------------|:---------------|:----------------|:------------|:-------------|:--------------|:--------------|:------------------|:-------|:--------------|:-----------------|:--------------------|:----------------|:--------------|:------------------|:-------------------|:--------------|:--------------|:--------------------|:-----------------|:--------------|:-------|:----------|:------|:--------|:-----------------|:----------|:----------|:--------------|:-----------------------------|:--------------|:---------------|:-------------|:--------|:-----------------|:-------------|:------------|:-----------------|:--------------|:-----------|:---------|:--------------|:---------|:-------------------|:------|:----------------------|:---------|:----------|:---------------|:---------------|:------|:---------|:---------|:--------|:-------------|:--------------------|:--------------------|:------------|:--------------|:------------|:-------|:---------------------|:----------------------|:--------------------|:------------|:-------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | X | X | X | X | | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | | | X | X | X | X | X | | | X | | | X | X | | | X | X | X | | | | X | | | X | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | X | X | | | X | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | X | | X | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 11 |  |  |  |  |  | X | | X | X | | | X | | | | | | | | | | X | X | X | X | | | | X | | | | X | X | | | | | | | | X | | X | | | X | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | X | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | X | | | | | X | X | | | | X | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | X | X | | | X | | | | | X | | | | | X | X | | | | | | | | | | | | X | | | | | | | | | | | | X | X | | | | X | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 36 |  |  |  |  |  | X | | X | X | | | | | | | | X | | | | | X | X | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | X | | X | X | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | | X | X | | | X | | | X | | | | | | | X | X | | | X | | | | | | | | | X | X | | | | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 9 |  |  |  |  |  | X | | X | X | | | X | | | | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 8 |  |  |  |  |  | X | | X | X | | | X | | | | | | | X | | | | X | | | | | | | | | | | X | X | | | | | X | | X | X | | | | X | X | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 10 |  |  |  |  |  | X | X | X | X | | | | | | | | X | | | | | X | X | | | | | | | | | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 12 | 13 |  |  |  |  |  | X | | X | X | | | | | | | | X | | | | | | X | | X | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | X | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | |
| 13 | 7 |  |  |  |  |  | X | X | X | X | | | X | | | | X | | | | | | | X | | | | | | | | | | | | X | | | | | X | | X | X | | | | X | | X | | | X | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_jondurbin__bagel-7b-v0.4 | ---
pretty_name: Evaluation run of jondurbin/bagel-7b-v0.4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/bagel-7b-v0.4](https://huggingface.co/jondurbin/bagel-7b-v0.4) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__bagel-7b-v0.4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T12:18:51.743149](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__bagel-7b-v0.4/blob/main/results_2024-02-09T12-18-51.743149.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6224226507357447,\n\
\ \"acc_stderr\": 0.03300491139206905,\n \"acc_norm\": 0.6261475680953128,\n\
\ \"acc_norm_stderr\": 0.03367429602929055,\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5420385268751854,\n\
\ \"mc2_stderr\": 0.015218334200579092\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6015358361774744,\n \"acc_stderr\": 0.014306946052735567,\n\
\ \"acc_norm\": 0.6356655290102389,\n \"acc_norm_stderr\": 0.014063260279882419\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6235809599681338,\n\
\ \"acc_stderr\": 0.004834969412883641,\n \"acc_norm\": 0.826727743477395,\n\
\ \"acc_norm_stderr\": 0.0037770896070954763\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337135,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337135\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340354,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340354\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n\
\ \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n\
\ \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n\
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3925925925925926,\n \"acc_stderr\": 0.02977384701253297,\n \
\ \"acc_norm\": 0.3925925925925926,\n \"acc_norm_stderr\": 0.02977384701253297\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059278,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059278\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"\
acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010344,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010344\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179337,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179337\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001501,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001501\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.02586220185227789,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.02586220185227789\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2927374301675978,\n\
\ \"acc_stderr\": 0.015218109544410184,\n \"acc_norm\": 0.2927374301675978,\n\
\ \"acc_norm_stderr\": 0.015218109544410184\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427906,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427906\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.026730620728004906,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.026730620728004906\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868055,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868055\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n\
\ \"acc_stderr\": 0.012635799922765846,\n \"acc_norm\": 0.4276401564537158,\n\
\ \"acc_norm_stderr\": 0.012635799922765846\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6421568627450981,\n \"acc_stderr\": 0.019393058402355442,\n \
\ \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.019393058402355442\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.02950489645459596,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.02950489645459596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5420385268751854,\n\
\ \"mc2_stderr\": 0.015218334200579092\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710686\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.47308567096285065,\n \
\ \"acc_stderr\": 0.013752517189717465\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/bagel-7b-v0.4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|arc:challenge|25_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|gsm8k|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hellaswag|10_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-18-51.743149.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T12-18-51.743149.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- '**/details_harness|winogrande|5_2024-02-09T12-18-51.743149.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T12-18-51.743149.parquet'
- config_name: results
data_files:
- split: 2024_02_09T12_18_51.743149
path:
- results_2024-02-09T12-18-51.743149.parquet
- split: latest
path:
- results_2024-02-09T12-18-51.743149.parquet
---
# Dataset Card for Evaluation run of jondurbin/bagel-7b-v0.4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jondurbin/bagel-7b-v0.4](https://huggingface.co/jondurbin/bagel-7b-v0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__bagel-7b-v0.4",
"harness_winogrande_5",
        split="latest")
```
## Latest results
These are the [latest results from run 2024-02-09T12:18:51.743149](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__bagel-7b-v0.4/blob/main/results_2024-02-09T12-18-51.743149.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6224226507357447,
"acc_stderr": 0.03300491139206905,
"acc_norm": 0.6261475680953128,
"acc_norm_stderr": 0.03367429602929055,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.016965517578930354,
"mc2": 0.5420385268751854,
"mc2_stderr": 0.015218334200579092
},
"harness|arc:challenge|25": {
"acc": 0.6015358361774744,
"acc_stderr": 0.014306946052735567,
"acc_norm": 0.6356655290102389,
"acc_norm_stderr": 0.014063260279882419
},
"harness|hellaswag|10": {
"acc": 0.6235809599681338,
"acc_stderr": 0.004834969412883641,
"acc_norm": 0.826727743477395,
"acc_norm_stderr": 0.0037770896070954763
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337135,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337135
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340354,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340354
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072387,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072387
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.02977384701253297,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.02977384701253297
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059278,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059278
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010344,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010344
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179337,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179337
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001501,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001501
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.02586220185227789,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.02586220185227789
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2927374301675978,
"acc_stderr": 0.015218109544410184,
"acc_norm": 0.2927374301675978,
"acc_norm_stderr": 0.015218109544410184
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02609016250427906,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02609016250427906
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.026730620728004906,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.026730620728004906
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.025557653981868055,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.025557653981868055
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4276401564537158,
"acc_stderr": 0.012635799922765846,
"acc_norm": 0.4276401564537158,
"acc_norm_stderr": 0.012635799922765846
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.019393058402355442,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.019393058402355442
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.02950489645459596,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.02950489645459596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.016965517578930354,
"mc2": 0.5420385268751854,
"mc2_stderr": 0.015218334200579092
},
"harness|winogrande|5": {
"acc": 0.7892659826361483,
"acc_stderr": 0.011462046419710686
},
"harness|gsm8k|5": {
"acc": 0.47308567096285065,
"acc_stderr": 0.013752517189717465
}
}
```
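As a minimal sketch of how these aggregated numbers can be inspected programmatically, the snippet below hard-codes a small excerpt of the results dictionary shown above and averages a metric across tasks. The `mean_metric` helper and the excerpted dictionary are illustrative only, not part of the dataset or its API:

```python
# Small excerpt of the aggregated results shown above, keyed by task name.
# (Illustrative only -- the full dictionary contains all evaluated tasks.)
results = {
    "harness|arc:challenge|25": {"acc": 0.6015358361774744, "acc_norm": 0.6356655290102389},
    "harness|hellaswag|10": {"acc": 0.6235809599681338, "acc_norm": 0.826727743477395},
    "harness|winogrande|5": {"acc": 0.7892659826361483},
    "harness|gsm8k|5": {"acc": 0.47308567096285065},
}

def mean_metric(results, metric="acc"):
    """Average a metric over all tasks that report it."""
    values = [scores[metric] for scores in results.values() if metric in scores]
    return sum(values) / len(values)

print(f"mean {'acc'} over {len(results)} tasks: {mean_metric(results):.4f}")
```

The same pattern applies to `acc_norm`, `mc1`, or `mc2`: tasks that do not report the requested metric are simply skipped.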
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kuroneko5943/jd21 | ---
annotations_creators:
- found
language:
- zh
language_creators:
- crowdsourced
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: jd21
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- jd
task_categories:
- text-classification
task_ids:
- sentiment-classification
--- |
LFBMS/class_dataset_real2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': bilanz_h
'1': bilanz_v
'2': guv
'3': kontennachweis_bilanz
'4': kontennachweis_guv
'5': other
splits:
- name: train
num_bytes: 345218235.409
num_examples: 1117
- name: test
num_bytes: 87105530.0
num_examples: 280
download_size: 400622867
dataset_size: 432323765.409
---
# Dataset Card for "class_dataset_real2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NghiemAbe/QQP_triplet | ---
dataset_info:
features:
- name: query
dtype: string
- name: positive
dtype: string
- name: negative
dtype: string
splits:
- name: train
num_bytes: 24975429
num_examples: 101762
download_size: 13187331
dataset_size: 24975429
task_categories:
- sentence-similarity
language:
- vi
---
# Dataset Card for "QQP_triplet"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_codeparrot__codeparrot | ---
pretty_name: Evaluation run of codeparrot/codeparrot
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codeparrot/codeparrot](https://huggingface.co/codeparrot/codeparrot) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codeparrot__codeparrot\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-27T22:03:29.134706](https://huggingface.co/datasets/open-llm-leaderboard/details_codeparrot__codeparrot/blob/main/results_2023-10-27T22-03-29.134706.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n\
\ \"em_stderr\": 0.00027736144573357115,\n \"f1\": 0.020442533557047064,\n\
\ \"f1_stderr\": 0.0007057901378550561,\n \"acc\": 0.252123807648879,\n\
\ \"acc_stderr\": 0.007682267037046532\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.00027736144573357115,\n\
\ \"f1\": 0.020442533557047064,\n \"f1_stderr\": 0.0007057901378550561\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674346\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5019731649565904,\n \"acc_stderr\": 0.014052376259225629\n\
\ }\n}\n```"
repo_url: https://huggingface.co/codeparrot/codeparrot
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|arc:challenge|25_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_27T22_03_29.134706
path:
- '**/details_harness|drop|3_2023-10-27T22-03-29.134706.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-27T22-03-29.134706.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_27T22_03_29.134706
path:
- '**/details_harness|gsm8k|5_2023-10-27T22-03-29.134706.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-27T22-03-29.134706.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hellaswag|10_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_27T22_03_29.134706
path:
- '**/details_harness|winogrande|5_2023-10-27T22-03-29.134706.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-27T22-03-29.134706.parquet'
- config_name: results
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- results_2023-09-21T22-35-18.428619.parquet
- split: 2023_10_27T22_03_29.134706
path:
- results_2023-10-27T22-03-29.134706.parquet
- split: latest
path:
- results_2023-10-27T22-03-29.134706.parquet
---
# Dataset Card for Evaluation run of codeparrot/codeparrot
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/codeparrot/codeparrot
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [codeparrot/codeparrot](https://huggingface.co/codeparrot/codeparrot) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codeparrot__codeparrot",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-27T22:03:29.134706](https://huggingface.co/datasets/open-llm-leaderboard/details_codeparrot__codeparrot/blob/main/results_2023-10-27T22-03-29.134706.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0007340604026845638,
"em_stderr": 0.00027736144573357115,
"f1": 0.020442533557047064,
"f1_stderr": 0.0007057901378550561,
"acc": 0.252123807648879,
"acc_stderr": 0.007682267037046532
},
"harness|drop|3": {
"em": 0.0007340604026845638,
"em_stderr": 0.00027736144573357115,
"f1": 0.020442533557047064,
"f1_stderr": 0.0007057901378550561
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674346
},
"harness|winogrande|5": {
"acc": 0.5019731649565904,
"acc_stderr": 0.014052376259225629
}
}
```
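As a sketch of how the nested results above might be consumed programmatically: each top-level key encodes `harness|<task>|<num_fewshot>`, and each value maps metric names to scores. The dictionary below simply mirrors a subset of the JSON shown above; the flattening logic is illustrative, not part of the leaderboard tooling.

```python
# Flatten the nested per-task results (structure mirrors the JSON above)
# into (task, metric, value) rows for easier tabulation.
results = {
    "harness|drop|3": {"em": 0.0007340604026845638, "f1": 0.020442533557047064},
    "harness|gsm8k|5": {"acc": 0.002274450341167551},
    "harness|winogrande|5": {"acc": 0.5019731649565904},
}

rows = [
    (task, metric, value)
    for task, metrics in results.items()
    for metric, value in metrics.items()
]

# Task names follow the "harness|<task>|<num_fewshot>" convention.
for task, metric, value in rows:
    _suite, name, shots = task.split("|")
    print(f"{name} ({shots}-shot) {metric}: {value:.4f}")
```

The same pattern applies to the full results file, which also carries the `_stderr` fields alongside each metric.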
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-anatomy-neg-prepend-verbal | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 5735
num_examples: 5
- name: test
num_bytes: 860423
num_examples: 135
download_size: 130157
dataset_size: 866158
---
# Dataset Card for "mmlu-anatomy-neg-prepend-verbal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
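The `answer` feature above is declared as a class label whose `names` map `0`–`3` to the letters `A`–`D`. A plain-Python sketch of decoding the stored integers back to letters (a stand-in for what `datasets.ClassLabel.int2str` does after loading the dataset):

```python
# Mapping taken from the `class_label` names in the card above:
# '0': A, '1': B, '2': C, '3': D.
ANSWER_NAMES = ["A", "B", "C", "D"]

def decode_answer(label: int) -> str:
    """Map a stored integer answer label to its letter."""
    return ANSWER_NAMES[label]

print(decode_answer(2))  # -> C
```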
xwjiang2010/simple_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input
dtype: string
splits:
- name: train
num_bytes: 14
num_examples: 2
download_size: 760
dataset_size: 14
---
# Dataset Card for "simple_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_house_16H_gosdt_l512_d3_sd3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 9224800000
num_examples: 100000
- name: validation
num_bytes: 922480000
num_examples: 10000
download_size: 3198840988
dataset_size: 10147280000
---
# Dataset Card for "autotree_automl_house_16H_gosdt_l512_d3_sd3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
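The split metadata above can be sanity-checked: if every example carries the same fixed-shape sequences, the serialized bytes per example should be identical across splits. A quick check using the numbers from the card:

```python
# (num_bytes, num_examples) per split, copied from the card above.
splits = {
    "train": (9_224_800_000, 100_000),
    "validation": (922_480_000, 10_000),
}

# Both splits work out to the same per-example size, as expected
# for fixed-shape sequence features.
for name, (num_bytes, num_examples) in splits.items():
    print(name, num_bytes // num_examples)  # both print 92248
```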
thobauma/harmless-eval-chuela2502 | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: clean
num_bytes: 3177260
num_examples: 2312
- name: poisoned
num_bytes: 3228553
num_examples: 2312
download_size: 3548158
dataset_size: 6405813
configs:
- config_name: default
data_files:
- split: clean
path: data/clean-*
- split: poisoned
path: data/poisoned-*
---
|
open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble | ---
pretty_name: Evaluation run of oh-yeontaek/llama-2-70B-LoRA-assemble
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [oh-yeontaek/llama-2-70B-LoRA-assemble](https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-14T11:41:03.022396](https://huggingface.co/datasets/open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble/blob/main/results_2023-09-14T11-41-03.022396.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6934330265245879,\n\
\ \"acc_stderr\": 0.031312838620430335,\n \"acc_norm\": 0.697335554746802,\n\
\ \"acc_norm_stderr\": 0.03128337547678218,\n \"mc1\": 0.46511627906976744,\n\
\ \"mc1_stderr\": 0.01746084997587397,\n \"mc2\": 0.6479539766332348,\n\
\ \"mc2_stderr\": 0.014916593992436448\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6851535836177475,\n \"acc_stderr\": 0.01357265770308495,\n\
\ \"acc_norm\": 0.7184300341296929,\n \"acc_norm_stderr\": 0.013143376735009022\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6707827126070504,\n\
\ \"acc_stderr\": 0.00468968597815517,\n \"acc_norm\": 0.867755427205736,\n\
\ \"acc_norm_stderr\": 0.0033806414709899157\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343603,\n\
\ \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343603\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.027134291628741702,\n\
\ \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.027134291628741702\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.040287315329475576,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.040287315329475576\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4656084656084656,\n \"acc_stderr\": 0.025690321762493844,\n \"\
acc_norm\": 0.4656084656084656,\n \"acc_norm_stderr\": 0.025690321762493844\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n\
\ \"acc_stderr\": 0.021417242936321582,\n \"acc_norm\": 0.8290322580645161,\n\
\ \"acc_norm_stderr\": 0.021417242936321582\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853113,\n \"\
acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853113\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6948717948717948,\n \"acc_stderr\": 0.023346335293325887,\n\
\ \"acc_norm\": 0.6948717948717948,\n \"acc_norm_stderr\": 0.023346335293325887\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.02720537153827947,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.02720537153827947\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8862385321100917,\n \"acc_stderr\": 0.013613614800232805,\n \"\
acc_norm\": 0.8862385321100917,\n \"acc_norm_stderr\": 0.013613614800232805\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8970588235294118,\n \"acc_stderr\": 0.021328337570804365,\n \"\
acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.021328337570804365\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n \
\ \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.02799153425851952,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.02799153425851952\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744633,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744633\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580663,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8633461047254151,\n\
\ \"acc_stderr\": 0.012282876868629234,\n \"acc_norm\": 0.8633461047254151,\n\
\ \"acc_norm_stderr\": 0.012282876868629234\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7716763005780347,\n \"acc_stderr\": 0.022598703804321635,\n\
\ \"acc_norm\": 0.7716763005780347,\n \"acc_norm_stderr\": 0.022598703804321635\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5743016759776536,\n\
\ \"acc_stderr\": 0.01653682964899712,\n \"acc_norm\": 0.5743016759776536,\n\
\ \"acc_norm_stderr\": 0.01653682964899712\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7556270096463023,\n\
\ \"acc_stderr\": 0.024406162094668907,\n \"acc_norm\": 0.7556270096463023,\n\
\ \"acc_norm_stderr\": 0.024406162094668907\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7993827160493827,\n \"acc_stderr\": 0.02228231394977488,\n\
\ \"acc_norm\": 0.7993827160493827,\n \"acc_norm_stderr\": 0.02228231394977488\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5645371577574967,\n\
\ \"acc_stderr\": 0.012663412101248345,\n \"acc_norm\": 0.5645371577574967,\n\
\ \"acc_norm_stderr\": 0.012663412101248345\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.0265565194700415,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.0265565194700415\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7418300653594772,\n \"acc_stderr\": 0.017704531653250078,\n \
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.017704531653250078\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866764,\n\
\ \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866764\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n\
\ \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n\
\ \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46511627906976744,\n\
\ \"mc1_stderr\": 0.01746084997587397,\n \"mc2\": 0.6479539766332348,\n\
\ \"mc2_stderr\": 0.014916593992436448\n }\n}\n```"
repo_url: https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|arc:challenge|25_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hellaswag|10_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T11-41-03.022396.parquet'
- config_name: results
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- results_2023-09-14T11-41-03.022396.parquet
- split: latest
path:
- results_2023-09-14T11-41-03.022396.parquet
---
# Dataset Card for Evaluation run of oh-yeontaek/llama-2-70B-LoRA-assemble
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [oh-yeontaek/llama-2-70B-LoRA-assemble](https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble",
"harness_truthfulqa_mc_0",
split="latest")
```
## Latest results
These are the [latest results from run 2023-09-14T11:41:03.022396](https://huggingface.co/datasets/open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble/blob/main/results_2023-09-14T11-41-03.022396.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6934330265245879,
"acc_stderr": 0.031312838620430335,
"acc_norm": 0.697335554746802,
"acc_norm_stderr": 0.03128337547678218,
"mc1": 0.46511627906976744,
"mc1_stderr": 0.01746084997587397,
"mc2": 0.6479539766332348,
"mc2_stderr": 0.014916593992436448
},
"harness|arc:challenge|25": {
"acc": 0.6851535836177475,
"acc_stderr": 0.01357265770308495,
"acc_norm": 0.7184300341296929,
"acc_norm_stderr": 0.013143376735009022
},
"harness|hellaswag|10": {
"acc": 0.6707827126070504,
"acc_stderr": 0.00468968597815517,
"acc_norm": 0.867755427205736,
"acc_norm_stderr": 0.0033806414709899157
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.03391160934343603,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.03391160934343603
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.027134291628741702,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.027134291628741702
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6638297872340425,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.6638297872340425,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.040287315329475576,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.040287315329475576
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4656084656084656,
"acc_stderr": 0.025690321762493844,
"acc_norm": 0.4656084656084656,
"acc_norm_stderr": 0.025690321762493844
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.021417242936321582,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.021417242936321582
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853113,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853113
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6948717948717948,
"acc_stderr": 0.023346335293325887,
"acc_norm": 0.6948717948717948,
"acc_norm_stderr": 0.023346335293325887
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.02720537153827947,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.02720537153827947
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8862385321100917,
"acc_stderr": 0.013613614800232805,
"acc_norm": 0.8862385321100917,
"acc_norm_stderr": 0.013613614800232805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.021328337570804365,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.021328337570804365
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.021644195727955173,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.021644195727955173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.02799153425851952,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.02799153425851952
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744633,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744633
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580663,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8633461047254151,
"acc_stderr": 0.012282876868629234,
"acc_norm": 0.8633461047254151,
"acc_norm_stderr": 0.012282876868629234
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7716763005780347,
"acc_stderr": 0.022598703804321635,
"acc_norm": 0.7716763005780347,
"acc_norm_stderr": 0.022598703804321635
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5743016759776536,
"acc_stderr": 0.01653682964899712,
"acc_norm": 0.5743016759776536,
"acc_norm_stderr": 0.01653682964899712
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7556270096463023,
"acc_stderr": 0.024406162094668907,
"acc_norm": 0.7556270096463023,
"acc_norm_stderr": 0.024406162094668907
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7993827160493827,
"acc_stderr": 0.02228231394977488,
"acc_norm": 0.7993827160493827,
"acc_norm_stderr": 0.02228231394977488
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5645371577574967,
"acc_stderr": 0.012663412101248345,
"acc_norm": 0.5645371577574967,
"acc_norm_stderr": 0.012663412101248345
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.0265565194700415,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.0265565194700415
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.017704531653250078,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.017704531653250078
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866764,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46511627906976744,
"mc1_stderr": 0.01746084997587397,
"mc2": 0.6479539766332348,
"mc2_stderr": 0.014916593992436448
}
}
```
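A payload in this shape is easy to post-process. Here is a minimal sketch (using a hand-copied subset of the scores above) that picks out the strongest and weakest MMLU tasks:

```python
# Minimal sketch: pull per-task accuracies out of a results payload like
# the one above and identify the strongest and weakest tasks.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.44},
    "harness|hendrycksTest-high_school_mathematics|5": {"acc": 0.31851851851851853},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.9},
}

# Strip the "harness|hendrycksTest-" prefix and the "|5" few-shot suffix.
accs = {k.split("-", 1)[-1].split("|")[0]: v["acc"] for k, v in results.items()}

best = max(accs, key=accs.get)
worst = min(accs, key=accs.get)
print(best, worst)  # -> us_foreign_policy high_school_mathematics
```

The same pattern applies to the full payload once it is loaded from the results parquet/JSON file.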
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
grsilva/rebel_portuguese | ---
license: mit
language:
- pt
pretty_name: rebel_pt
---
This is a dataset that was created to re-train [REBEL](https://github.com/Babelscape/rebel) to work better for the Portuguese language.
This dataset was generated using [CROCODILE](https://github.com/Babelscape/crocodile), which was adapted to use a Portuguese-specific model (pt_core_news_sm) instead of the default multi-language model (xx_ent_wiki_sm).
The dataset comes with train, test, dev and train_dev splits. The train_dev split accounts for 80% of the dataset, with the remaining 20% being the test data. The train and dev splits were generated by further splitting the train_dev data 80/20.
The split for the dataset ends up being:
* Train_dev -> 80% of the data
* Test -> 20% of the data
* Train -> 64% of the data
* Dev -> 16% of the data |
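The nested 80/20 split described above works out as follows; a minimal sketch of the arithmetic:

```python
total = 1.0

train_dev = 0.8 * total   # train_dev split: 80% of the data
test = total - train_dev  # test split: 20% of the data

# train_dev is itself split 80/20 into train and dev
train = 0.8 * train_dev   # 64% of the full dataset
dev = train_dev - train   # 16% of the full dataset

print(round(train, 2), round(dev, 2))  # -> 0.64 0.16
```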
vitaliy-sharandin/pollution-absolute-variation-co2 | ---
dataset_info:
features:
- name: Entity
dtype: string
- name: Code
dtype: string
- name: Annual CO₂ emissions growth (abs)
dtype: float64
- name: Year
dtype: timestamp[ns, tz=UTC]
- name: dt
dtype: timestamp[ns, tz=UTC]
splits:
- name: train
num_bytes: 1295730
num_examples: 28944
download_size: 350866
dataset_size: 1295730
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "pollution-absolute-variation-co2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heegyu/korean-petitions | ---
license: mit
---
# 청와대 국민청원 (Blue House National Petitions)
Data source: https://github.com/lovit/petitions_archive<br/>
Size: 651.8MB
Sample:
```
{
"category": "반려동물",
"begin": "2017-08-25",
"end": "2017-11-23",
"content": "길고양이들 밥주고있는 사람입니다. 최근에 동네주민과 트러블이 생겨 싸움이 일어났습니다. 길고양이들이 모여든다고 밥주지마라고 윽박지르셨습니다. 쓰레기봉투를 뜯는다거나 사람에게 해끼치거나 하지 않았습니다. 단순히 고양이가 모여드는게 싫답니다. 그럼 애들은 굶어죽어야하나요? 길고양이들이 맘놓고 쉬고 밥먹을 수 있는 환경이 전혀 없는데 무작정 밥안주고 물 안주면 얘네는 어떻게 하나요? 안그래도 수명도 짧은데다가 길고양이를 상대로 학대하는 사람들도 많은데 너무 가엾습니다. 강동구청은 고양이 급식소라고 만들어주셨던데 동네마다 한개씩이라도 만들어 주셨으면좋겠어요.. 밥에다가 이상한짓 하는 사람 있을 수 있으니까 cctv도 설치도 해주셨으면 합니다.. (급식소에 쥐약을 뿌려 고양이가 죽은 사례가 있습니다) 지구가 사람껀 아니잖아요 동물과도 더불어 살줄 알아야죠 문대통령님께서 동물복지 관련 공략을 내셨지만 나아진게 전혀 없는거같아요. 공략 꼭 지켜주세요.. 믿고 뽑았는데 전혀 나아지고 바뀐게 없으면 너무 실망스럽잖아요.. 그리고 길고양이뿐만 아니라 다른 동물 학대하는 부분도 처벌 강화 부탁드립니다",
"num_agree": 5,
"petition_idx": "513",
"status": "청원종료",
"title": "길고양이를 도와주세요"
}
``` |
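Records in this shape can be processed directly with the standard library. For example, a minimal sketch computing how long the sample petition above was open (field names taken from the sample record):

```python
from datetime import date

# Field names and values taken from the sample record above.
petition = {
    "begin": "2017-08-25",
    "end": "2017-11-23",
    "num_agree": 5,
    "category": "반려동물",
}

begin = date.fromisoformat(petition["begin"])
end = date.fromisoformat(petition["end"])
duration_days = (end - begin).days
print(duration_days)  # -> 90
```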
Isamu136/big-animal-dataset-high-res-embedding-with-hidden-states | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
- name: l14_embeddings
sequence: float32
- name: moco_vitb_imagenet_embeddings
sequence: float32
- name: ibot_b_16_embedding
sequence: float32
- name: ibot_b_16_last_self_attn
sequence: float32
- name: midas_dpt_swin2_large_384
dtype: image
- name: subject_noun
dtype: string
- name: moco_vitb_imagenet_embeddings_without_last_layer
sequence: float32
- name: moco_vitb_imagenet_hidden_state
sequence:
sequence: float32
splits:
- name: train
num_bytes: 19608883787.94
num_examples: 26180
download_size: 17552223513
dataset_size: 19608883787.94
---
# Dataset Card for "big-animal-dataset-high-res-embedding-with-hidden-states"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vicgalle/alpaca-gpt4 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 88566301
num_examples: 52002
download_size: 48393562
dataset_size: 88566301
task_categories:
- text-generation
- conversational
- question-answering
language:
- en
size_categories:
- 10K<n<100K
license: cc-by-nc-4.0
tags:
- gpt4
- alpaca
- instruction-finetuning
- synthetic
---
# Dataset Card for "alpaca-gpt4"
This dataset contains English Instruction-Following generated by GPT-4 using Alpaca prompts for fine-tuning LLMs.
The dataset was originally shared in this repository: https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM. This is just a wrapper for compatibility with Hugging Face's `datasets` library.
## Dataset Description
- **Homepage:** https://instruction-tuning-with-gpt-4.github.io
- **Repository:** https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM
- **Paper:** https://arxiv.org/abs/2304.03277
## Dataset structure
It contains 52K instruction-following data generated by GPT-4 using the same prompts as in Alpaca.
The dataset has the same format as Alpaca data, except the output is generated by GPT-4:
- `instruction`: `str`, describes the task the model should perform. Each of the 52K instructions is unique.
- `input`: `str`, optional context or input for the task.
- `output`: `str`, the answer to the instruction as generated by `GPT-4`.
- `text`: `str`, all the previous fields concatenated together, with the same prompt used in Alpaca prepended at the beginning.
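As an illustrative (not official) sketch, the `text` field can be reconstructed from the other three fields using the Alpaca prompt template that appears verbatim in the examples further down this card; `build_text` is a hypothetical helper, not part of the dataset or of the `datasets` library:

```python
# Illustrative sketch: rebuild the `text` field from the other three fields
# using the Alpaca prompt template. Note that the original Alpaca recipe uses
# a slightly different template when `input` is empty; this sketch only
# covers the non-empty-input case.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task, paired with an input "
    "that provides further context. Write a response that appropriately "
    "completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Input:\n{input}\n\n"
    "### Response:\n{output}"
)

def build_text(example: dict) -> str:
    """Concatenate instruction, input and output under the Alpaca prompt."""
    return ALPACA_TEMPLATE.format(**example)

row = {
    "instruction": "Identify the odd one out.",
    "input": "Twitter, Instagram, Telegram",
    "output": "Telegram",
}
print(build_text(row).endswith("### Response:\nTelegram"))  # True
```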
## Difference with the original Alpaca dataset
The original Alpaca dataset used text-davinci-003 to complete the prompts. This dataset uses those same prompts, but generates the completions with GPT-4. Thus, in general, the responses are of higher quality and length. Here is an example:
#### Example from Alpaca-GPT4:
```bash
{'instruction': 'Identify the odd one out.',
'input': 'Twitter, Instagram, Telegram',
'output': 'The odd one out is Telegram. Twitter and Instagram are social media platforms mainly for sharing information, images and videos while Telegram is a cloud-based instant messaging and voice-over-IP service.',
'text': 'Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.\n\n### Instruction:\nIdentify the odd one out.\n\n### Input:\nTwitter, Instagram, Telegram\n\n### Response:\nThe odd one out is Telegram. Twitter and Instagram are social media platforms mainly for sharing information, images and videos while Telegram is a cloud-based instant messaging and voice-over-IP service.'}
```
#### Same example from original Alpaca:
```bash
{'instruction': 'Identify the odd one out.',
'input': 'Twitter, Instagram, Telegram',
'output': 'Telegram',
'text': 'Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.\n\n### Instruction:\nIdentify the odd one out.\n\n### Input:\nTwitter, Instagram, Telegram\n\n### Response:\nTelegram'}
```
## Licensing Information
The dataset is available under the [Creative Commons NonCommercial (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/legalcode). |
liuyanchen1015/MULTI_VALUE_sst2_after_perfect | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 3391
num_examples: 23
- name: test
num_bytes: 9202
num_examples: 57
- name: train
num_bytes: 120178
num_examples: 844
download_size: 62187
dataset_size: 132771
---
# Dataset Card for "MULTI_VALUE_sst2_after_perfect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
0xAnders/ama-bot | ---
license: apache-2.0
---
|
Nan-Do/code-search-net-go | ---
dataset_info:
features:
- name: repo
dtype: string
- name: path
dtype: string
- name: func_name
dtype: string
- name: original_string
dtype: string
- name: language
dtype: string
- name: code
dtype: string
- name: code_tokens
sequence: string
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: sha
dtype: string
- name: url
dtype: string
- name: partition
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 833011518
num_examples: 345890
download_size: 239636894
dataset_size: 833011518
license: apache-2.0
task_categories:
- text-generation
- text2text-generation
- summarization
language:
- en
tags:
- code
- go
- CodeSearchNet
- summary
pretty_name: Go CodeSearchNet with Summaries
---
# Dataset Card for "code-search-net-go"
## Dataset Description
- **Homepage:** None
- **Repository:** https://huggingface.co/datasets/Nan-Do/code-search-net-go
- **Paper:** None
- **Leaderboard:** None
- **Point of Contact:** [@Nan-Do](https://github.com/Nan-Do)
### Dataset Summary
This dataset is the Go portion of CodeSearchNet, annotated with an additional summary column.
The CodeSearchNet dataset contains open-source functions, together with their accompanying comments, collected from GitHub.
The summary is a short description of what the function does.
### Languages
The dataset's comments are in English and the functions are written in Go.
### Data Splits
The train, test, and validation labels are included in the dataset as a column rather than as separate splits.
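Because the split labels live in a column (`partition`) rather than in separate dataset splits, downstream code typically has to regroup the rows itself. A minimal plain-Python sketch (the exact partition values, such as `"train"` or `"test"`, are assumptions here):

```python
# Illustrative only: group a list of records by their `partition` value,
# mirroring how the train/test/validation labels are stored as a column.
from collections import defaultdict

def split_by_partition(records):
    """Return a dict mapping each partition label to its list of records."""
    buckets = defaultdict(list)
    for rec in records:
        buckets[rec["partition"]].append(rec)
    return dict(buckets)

rows = [
    {"func_name": "Sum", "partition": "train"},
    {"func_name": "Min", "partition": "test"},
    {"func_name": "Max", "partition": "train"},
]
splits = split_by_partition(rows)
print(sorted(splits))        # ['test', 'train']
print(len(splits["train"]))  # 2
```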
## Dataset Creation
May of 2023
### Curation Rationale
This dataset can be used to generate instructional (or many other interesting) datasets that are useful for training LLMs.
### Source Data
The CodeSearchNet dataset can be found at https://www.kaggle.com/datasets/omduggineni/codesearchnet
### Annotations
This dataset includes a `summary` column containing a short description of each function.
#### Annotation process
The annotation procedure was done using [Salesforce](https://huggingface.co/Salesforce) T5 summarization models.
A sample notebook of the process can be found at https://github.com/Nan-Do/OpenAssistantInstructionResponsePython
The annotations have been cleaned to remove repetitions and meaningless summaries (although some may still be present in the dataset).
### Licensing Information
Apache 2.0 |
jlbaker361/prior-cold | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: man
dtype: image
- name: woman
dtype: image
- name: boy
dtype: image
- name: girl
dtype: image
- name: character
dtype: image
- name: person
dtype: image
splits:
- name: train
num_bytes: 120130011.0
num_examples: 42
download_size: 120140088
dataset_size: 120130011.0
---
flavor: cold
num_inference_steps: 30
|
GannaHelal/dataset1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 19957.0
num_examples: 3
download_size: 0
dataset_size: 19957.0
---
# Dataset Card for "dataset1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_v5-mathemak-0d489a-2053267106 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_v5
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-125m_eval
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_v5
dataset_config: mathemakitten--winobias_antistereotype_test_v5
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-125m_eval
* Dataset: mathemakitten/winobias_antistereotype_test_v5
* Config: mathemakitten--winobias_antistereotype_test_v5
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
irds/neumarco_zh_dev_judged | ---
pretty_name: '`neumarco/zh/dev/judged`'
viewer: false
source_datasets: ['irds/neumarco_zh', 'irds/neumarco_zh_dev']
task_categories:
- text-retrieval
---
# Dataset Card for `neumarco/zh/dev/judged`
The `neumarco/zh/dev/judged` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/neumarco#neumarco/zh/dev/judged).
# Data
This dataset provides:
- `queries` (i.e., topics); count=55,578
- For `docs`, use [`irds/neumarco_zh`](https://huggingface.co/datasets/irds/neumarco_zh)
- For `qrels`, use [`irds/neumarco_zh_dev`](https://huggingface.co/datasets/irds/neumarco_zh_dev)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/neumarco_zh_dev_judged', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
result-kand2-sdxl-wuerst-karlo/b50562e5 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 170
num_examples: 10
download_size: 1334
dataset_size: 170
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "b50562e5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_stsb_a_ing | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 124519
num_examples: 803
- name: test
num_bytes: 88945
num_examples: 661
- name: train
num_bytes: 348996
num_examples: 2396
download_size: 320431
dataset_size: 562460
---
# Dataset Card for "MULTI_VALUE_stsb_a_ing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Gummybear05/Y_normal | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sample_rate
dtype: int64
- name: text
dtype: string
- name: scriptId
dtype: int64
- name: fileNm
dtype: string
- name: recrdTime
dtype: float64
- name: recrdQuality
dtype: int64
- name: recrdDt
dtype: string
- name: scriptSetNo
dtype: string
- name: recrdEnvrn
dtype: string
- name: colctUnitCode
dtype: string
- name: cityCode
dtype: string
- name: recrdUnit
dtype: string
- name: convrsThema
dtype: string
- name: gender
dtype: string
- name: recorderId
dtype: string
- name: age
dtype: int64
splits:
- name: train
num_bytes: 9454815261
num_examples: 12401
- name: test
num_bytes: 504186814
num_examples: 605
download_size: 2181835743
dataset_size: 9959002075
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
xnli | ---
language:
- ar
- bg
- de
- el
- en
- es
- fr
- hi
- ru
- sw
- th
- tr
- ur
- vi
- zh
paperswithcode_id: xnli
pretty_name: Cross-lingual Natural Language Inference
dataset_info:
- config_name: all_languages
features:
- name: premise
dtype:
translation:
languages:
- ar
- bg
- de
- el
- en
- es
- fr
- hi
- ru
- sw
- th
- tr
- ur
- vi
- zh
- name: hypothesis
dtype:
translation_variable_languages:
languages:
- ar
- bg
- de
- el
- en
- es
- fr
- hi
- ru
- sw
- th
- tr
- ur
- vi
- zh
num_languages: 15
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 1581471691
num_examples: 392702
- name: test
num_bytes: 19387432
num_examples: 5010
- name: validation
num_bytes: 9566179
num_examples: 2490
download_size: 963942271
dataset_size: 1610425302
- config_name: ar
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 107399614
num_examples: 392702
- name: test
num_bytes: 1294553
num_examples: 5010
- name: validation
num_bytes: 633001
num_examples: 2490
download_size: 59215902
dataset_size: 109327168
- config_name: bg
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 125973225
num_examples: 392702
- name: test
num_bytes: 1573034
num_examples: 5010
- name: validation
num_bytes: 774061
num_examples: 2490
download_size: 66117878
dataset_size: 128320320
- config_name: de
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 84684140
num_examples: 392702
- name: test
num_bytes: 996488
num_examples: 5010
- name: validation
num_bytes: 494604
num_examples: 2490
download_size: 55973883
dataset_size: 86175232
- config_name: el
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 139753358
num_examples: 392702
- name: test
num_bytes: 1704785
num_examples: 5010
- name: validation
num_bytes: 841226
num_examples: 2490
download_size: 74551247
dataset_size: 142299369
- config_name: en
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 74444026
num_examples: 392702
- name: test
num_bytes: 875134
num_examples: 5010
- name: validation
num_bytes: 433463
num_examples: 2490
download_size: 50627367
dataset_size: 75752623
- config_name: es
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 81383284
num_examples: 392702
- name: test
num_bytes: 969813
num_examples: 5010
- name: validation
num_bytes: 478422
num_examples: 2490
download_size: 53677157
dataset_size: 82831519
- config_name: fr
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 85808779
num_examples: 392702
- name: test
num_bytes: 1029239
num_examples: 5010
- name: validation
num_bytes: 510104
num_examples: 2490
download_size: 55968680
dataset_size: 87348122
- config_name: hi
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 170593964
num_examples: 392702
- name: test
num_bytes: 2073073
num_examples: 5010
- name: validation
num_bytes: 1023915
num_examples: 2490
download_size: 70908548
dataset_size: 173690952
- config_name: ru
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 129859615
num_examples: 392702
- name: test
num_bytes: 1603466
num_examples: 5010
- name: validation
num_bytes: 786442
num_examples: 2490
download_size: 70702606
dataset_size: 132249523
- config_name: sw
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 69285725
num_examples: 392702
- name: test
num_bytes: 871651
num_examples: 5010
- name: validation
num_bytes: 429850
num_examples: 2490
download_size: 45564152
dataset_size: 70587226
- config_name: th
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 176062892
num_examples: 392702
- name: test
num_bytes: 2147015
num_examples: 5010
- name: validation
num_bytes: 1061160
num_examples: 2490
download_size: 77222045
dataset_size: 179271067
- config_name: tr
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 71637140
num_examples: 392702
- name: test
num_bytes: 934934
num_examples: 5010
- name: validation
num_bytes: 459308
num_examples: 2490
download_size: 48509680
dataset_size: 73031382
- config_name: ur
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 96441486
num_examples: 392702
- name: test
num_bytes: 1416241
num_examples: 5010
- name: validation
num_bytes: 699952
num_examples: 2490
download_size: 46682785
dataset_size: 98557679
- config_name: vi
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 101417430
num_examples: 392702
- name: test
num_bytes: 1190217
num_examples: 5010
- name: validation
num_bytes: 590680
num_examples: 2490
download_size: 57690058
dataset_size: 103198327
- config_name: zh
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 72224841
num_examples: 392702
- name: test
num_bytes: 777929
num_examples: 5010
- name: validation
num_bytes: 384851
num_examples: 2490
download_size: 48269855
dataset_size: 73387621
configs:
- config_name: all_languages
data_files:
- split: train
path: all_languages/train-*
- split: test
path: all_languages/test-*
- split: validation
path: all_languages/validation-*
- config_name: ar
data_files:
- split: train
path: ar/train-*
- split: test
path: ar/test-*
- split: validation
path: ar/validation-*
- config_name: bg
data_files:
- split: train
path: bg/train-*
- split: test
path: bg/test-*
- split: validation
path: bg/validation-*
- config_name: de
data_files:
- split: train
path: de/train-*
- split: test
path: de/test-*
- split: validation
path: de/validation-*
- config_name: el
data_files:
- split: train
path: el/train-*
- split: test
path: el/test-*
- split: validation
path: el/validation-*
- config_name: en
data_files:
- split: train
path: en/train-*
- split: test
path: en/test-*
- split: validation
path: en/validation-*
- config_name: es
data_files:
- split: train
path: es/train-*
- split: test
path: es/test-*
- split: validation
path: es/validation-*
- config_name: fr
data_files:
- split: train
path: fr/train-*
- split: test
path: fr/test-*
- split: validation
path: fr/validation-*
- config_name: hi
data_files:
- split: train
path: hi/train-*
- split: test
path: hi/test-*
- split: validation
path: hi/validation-*
- config_name: ru
data_files:
- split: train
path: ru/train-*
- split: test
path: ru/test-*
- split: validation
path: ru/validation-*
- config_name: sw
data_files:
- split: train
path: sw/train-*
- split: test
path: sw/test-*
- split: validation
path: sw/validation-*
- config_name: th
data_files:
- split: train
path: th/train-*
- split: test
path: th/test-*
- split: validation
path: th/validation-*
- config_name: tr
data_files:
- split: train
path: tr/train-*
- split: test
path: tr/test-*
- split: validation
path: tr/validation-*
- config_name: ur
data_files:
- split: train
path: ur/train-*
- split: test
path: ur/test-*
- split: validation
path: ur/validation-*
- config_name: vi
data_files:
- split: train
path: vi/train-*
- split: test
path: vi/test-*
- split: validation
path: vi/validation-*
- config_name: zh
data_files:
- split: train
path: zh/train-*
- split: test
path: zh/test-*
- split: validation
path: zh/validation-*
---
# Dataset Card for "xnli"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://www.nyu.edu/projects/bowman/xnli/](https://www.nyu.edu/projects/bowman/xnli/)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 7.74 GB
- **Size of the generated dataset:** 3.23 GB
- **Total amount of disk used:** 10.97 GB
### Dataset Summary
XNLI is a subset of a few thousand examples from MNLI which has been translated
into 14 different languages (some low-ish resource). As with MNLI, the goal is
to predict textual entailment (does sentence A imply/contradict/neither sentence
B) and is a classification task (given two sentences, predict one of three
labels).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### all_languages
- **Size of downloaded dataset files:** 483.96 MB
- **Size of the generated dataset:** 1.61 GB
- **Total amount of disk used:** 2.09 GB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"hypothesis": "{\"language\": [\"ar\", \"bg\", \"de\", \"el\", \"en\", \"es\", \"fr\", \"hi\", \"ru\", \"sw\", \"th\", \"tr\", \"ur\", \"vi\", \"zh\"], \"translation\": [\"احد اع...",
"label": 0,
"premise": "{\"ar\": \"واحدة من رقابنا ستقوم بتنفيذ تعليماتك كلها بكل دقة\", \"bg\": \"един от нашите номера ще ви даде инструкции .\", \"de\": \"Eine ..."
}
```
#### ar
- **Size of downloaded dataset files:** 483.96 MB
- **Size of the generated dataset:** 109.32 MB
- **Total amount of disk used:** 593.29 MB
An example of 'validation' looks as follows.
```
{
"hypothesis": "اتصل بأمه حالما أوصلته حافلة المدرسية.",
"label": 1,
"premise": "وقال، ماما، لقد عدت للمنزل."
}
```
#### bg
- **Size of downloaded dataset files:** 483.96 MB
- **Size of the generated dataset:** 128.32 MB
- **Total amount of disk used:** 612.28 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"hypothesis": "\"губиш нещата на следното ниво , ако хората си припомнят .\"...",
"label": 0,
"premise": "\"по време на сезона и предполагам , че на твоето ниво ще ги загубиш на следващото ниво , ако те решат да си припомнят отбора на ..."
}
```
#### de
- **Size of downloaded dataset files:** 483.96 MB
- **Size of the generated dataset:** 86.17 MB
- **Total amount of disk used:** 570.14 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"hypothesis": "Man verliert die Dinge auf die folgende Ebene , wenn sich die Leute erinnern .",
"label": 0,
"premise": "\"Du weißt , während der Saison und ich schätze , auf deiner Ebene verlierst du sie auf die nächste Ebene , wenn sie sich entschl..."
}
```
#### el
- **Size of downloaded dataset files:** 483.96 MB
- **Size of the generated dataset:** 142.30 MB
- **Total amount of disk used:** 626.26 MB
An example of 'validation' looks as follows.
```
This example was too long and was cropped:
{
"hypothesis": "\"Τηλεφώνησε στη μαμά του μόλις το σχολικό λεωφορείο τον άφησε.\"...",
"label": 1,
"premise": "Και είπε, Μαμά, έφτασα στο σπίτι."
}
```
### Data Fields
The data fields are the same among all splits.
#### all_languages
- `premise`: a multilingual `string` variable, with possible languages including `ar`, `bg`, `de`, `el`, `en`.
- `hypothesis`: a multilingual `string` variable, with possible languages including `ar`, `bg`, `de`, `el`, `en`.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
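The cropped `all_languages` training example above suggests the hypothesis stores its translations as parallel `language`/`translation` lists. A hedged sketch of extracting one language from such a field (`pick_translation` is an illustrative helper, not a `datasets` API):

```python
# Illustrative helper (not a datasets API): select one language out of the
# parallel language/translation lists of an `all_languages` hypothesis field.
def pick_translation(field: dict, lang: str) -> str:
    """Return the translation at the position matching `lang` in `language`."""
    idx = field["language"].index(lang)  # raises ValueError if `lang` is absent
    return field["translation"][idx]

hypothesis = {
    "language": ["ar", "en", "fr"],
    "translation": ["...", "An English hypothesis.", "..."],
}
print(pick_translation(hypothesis, "en"))  # An English hypothesis.
```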
#### ar
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
#### bg
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
#### de
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
#### el
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
### Data Splits
| name |train |validation|test|
|-------------|-----:|---------:|---:|
|all_languages|392702| 2490|5010|
|ar |392702| 2490|5010|
|bg |392702| 2490|5010|
|de |392702| 2490|5010|
|el |392702| 2490|5010|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{conneau2018xnli,
author = {Conneau, Alexis
and Rinott, Ruty
and Lample, Guillaume
and Williams, Adina
and Bowman, Samuel R.
and Schwenk, Holger
and Stoyanov, Veselin},
title = {XNLI: Evaluating Cross-lingual Sentence Representations},
booktitle = {Proceedings of the 2018 Conference on Empirical Methods
in Natural Language Processing},
year = {2018},
publisher = {Association for Computational Linguistics},
location = {Brussels, Belgium},
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun), [@mariamabarham](https://github.com/mariamabarham), [@thomwolf](https://github.com/thomwolf), [@lhoestq](https://github.com/lhoestq), [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset. |
edumunozsala/counter-hate-speech-es | ---
dataset_info:
features:
- name: HS
dtype: string
- name: CN
dtype: string
splits:
- name: train
num_bytes: 1067088
num_examples: 3572
download_size: 619817
dataset_size: 1067088
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_AA051615__L0225 | ---
pretty_name: Evaluation run of AA051615/L0225
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051615/L0225](https://huggingface.co/AA051615/L0225) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051615__L0225\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T05:21:12.964101](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051615__L0225/blob/main/results_2024-03-01T05-21-12.964101.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.8192856560436336,\n\
\ \"acc_stderr\": 0.02522551022169272,\n \"acc_norm\": 0.8278237600056262,\n\
\ \"acc_norm_stderr\": 0.025630597996564967,\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5419063822595955,\n\
\ \"mc2_stderr\": 0.015465200826091909\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.014063260279882419,\n\
\ \"acc_norm\": 0.681740614334471,\n \"acc_norm_stderr\": 0.013611993916971451\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6269667396932882,\n\
\ \"acc_stderr\": 0.004826224784850442,\n \"acc_norm\": 0.8273252340171281,\n\
\ \"acc_norm_stderr\": 0.003771934042799158\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7925925925925926,\n\
\ \"acc_stderr\": 0.03502553170678318,\n \"acc_norm\": 0.7925925925925926,\n\
\ \"acc_norm_stderr\": 0.03502553170678318\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8881578947368421,\n \"acc_stderr\": 0.025648341251693598,\n\
\ \"acc_norm\": 0.8881578947368421,\n \"acc_norm_stderr\": 0.025648341251693598\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.83,\n\
\ \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \
\ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8905660377358491,\n \"acc_stderr\": 0.019213530010965436,\n\
\ \"acc_norm\": 0.8905660377358491,\n \"acc_norm_stderr\": 0.019213530010965436\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9583333333333334,\n\
\ \"acc_stderr\": 0.016710315802959983,\n \"acc_norm\": 0.9583333333333334,\n\
\ \"acc_norm_stderr\": 0.016710315802959983\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.815028901734104,\n\
\ \"acc_stderr\": 0.029605623981771197,\n \"acc_norm\": 0.815028901734104,\n\
\ \"acc_norm_stderr\": 0.029605623981771197\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.89,\n \"acc_stderr\": 0.03144660377352201,\n \"acc_norm\": 0.89,\n\
\ \"acc_norm_stderr\": 0.03144660377352201\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.8382978723404255,\n \"acc_stderr\": 0.024068505289695338,\n\
\ \"acc_norm\": 0.8382978723404255,\n \"acc_norm_stderr\": 0.024068505289695338\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6842105263157895,\n\
\ \"acc_stderr\": 0.043727482902780085,\n \"acc_norm\": 0.6842105263157895,\n\
\ \"acc_norm_stderr\": 0.043727482902780085\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.8620689655172413,\n \"acc_stderr\": 0.028735632183908073,\n\
\ \"acc_norm\": 0.8620689655172413,\n \"acc_norm_stderr\": 0.028735632183908073\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7936507936507936,\n \"acc_stderr\": 0.02084229093011468,\n \"\
acc_norm\": 0.7936507936507936,\n \"acc_norm_stderr\": 0.02084229093011468\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6190476190476191,\n\
\ \"acc_stderr\": 0.04343525428949099,\n \"acc_norm\": 0.6190476190476191,\n\
\ \"acc_norm_stderr\": 0.04343525428949099\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.9483870967741935,\n \"acc_stderr\": 0.012586144774300194,\n \"\
acc_norm\": 0.9483870967741935,\n \"acc_norm_stderr\": 0.012586144774300194\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.7389162561576355,\n \"acc_stderr\": 0.030903796952114468,\n \"\
acc_norm\": 0.7389162561576355,\n \"acc_norm_stderr\": 0.030903796952114468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\"\
: 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.022448399923854286,\n\
\ \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.022448399923854286\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9545454545454546,\n \"acc_stderr\": 0.014840681800540868,\n \"\
acc_norm\": 0.9545454545454546,\n \"acc_norm_stderr\": 0.014840681800540868\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084352,\n\
\ \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084352\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.882051282051282,\n \"acc_stderr\": 0.016353801778303395,\n \
\ \"acc_norm\": 0.882051282051282,\n \"acc_norm_stderr\": 0.016353801778303395\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.5851851851851851,\n \"acc_stderr\": 0.030039842454069283,\n \
\ \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.030039842454069283\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.9243697478991597,\n \"acc_stderr\": 0.01717498881493851,\n \
\ \"acc_norm\": 0.9243697478991597,\n \"acc_norm_stderr\": 0.01717498881493851\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.6158940397350994,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.6158940397350994,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9486238532110092,\n \"acc_stderr\": 0.009465168181022976,\n \"\
acc_norm\": 0.9486238532110092,\n \"acc_norm_stderr\": 0.009465168181022976\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.029531221160930918,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.029531221160930918\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.946078431372549,\n \"acc_stderr\": 0.015852465281106922,\n\
\ \"acc_norm\": 0.946078431372549,\n \"acc_norm_stderr\": 0.015852465281106922\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9493670886075949,\n \"acc_stderr\": 0.014271760025370174,\n \
\ \"acc_norm\": 0.9493670886075949,\n \"acc_norm_stderr\": 0.014271760025370174\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.852017937219731,\n\
\ \"acc_stderr\": 0.023831557157613533,\n \"acc_norm\": 0.852017937219731,\n\
\ \"acc_norm_stderr\": 0.023831557157613533\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9504132231404959,\n \"acc_stderr\": 0.019817485633523632,\n \"\
acc_norm\": 0.9504132231404959,\n \"acc_norm_stderr\": 0.019817485633523632\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.03038159675665168,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.03038159675665168\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.9386503067484663,\n \"acc_stderr\": 0.01885387414579323,\n\
\ \"acc_norm\": 0.9386503067484663,\n \"acc_norm_stderr\": 0.01885387414579323\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6607142857142857,\n\
\ \"acc_stderr\": 0.044939490686135404,\n \"acc_norm\": 0.6607142857142857,\n\
\ \"acc_norm_stderr\": 0.044939490686135404\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.9320388349514563,\n \"acc_stderr\": 0.02491995914251447,\n\
\ \"acc_norm\": 0.9320388349514563,\n \"acc_norm_stderr\": 0.02491995914251447\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9487179487179487,\n\
\ \"acc_stderr\": 0.014450181176872742,\n \"acc_norm\": 0.9487179487179487,\n\
\ \"acc_norm_stderr\": 0.014450181176872742\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466143,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466143\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9399744572158365,\n\
\ \"acc_stderr\": 0.008494204207108457,\n \"acc_norm\": 0.9399744572158365,\n\
\ \"acc_norm_stderr\": 0.008494204207108457\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.838150289017341,\n \"acc_stderr\": 0.019829299214925416,\n\
\ \"acc_norm\": 0.838150289017341,\n \"acc_norm_stderr\": 0.019829299214925416\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8513966480446927,\n\
\ \"acc_stderr\": 0.011896289146701147,\n \"acc_norm\": 0.8513966480446927,\n\
\ \"acc_norm_stderr\": 0.011896289146701147\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8921568627450981,\n \"acc_stderr\": 0.0177609809027895,\n\
\ \"acc_norm\": 0.8921568627450981,\n \"acc_norm_stderr\": 0.0177609809027895\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8971061093247589,\n\
\ \"acc_stderr\": 0.017255830051445344,\n \"acc_norm\": 0.8971061093247589,\n\
\ \"acc_norm_stderr\": 0.017255830051445344\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.018689725721062072,\n\
\ \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.018689725721062072\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.7092198581560284,\n \"acc_stderr\": 0.027090664368353178,\n \
\ \"acc_norm\": 0.7092198581560284,\n \"acc_norm_stderr\": 0.027090664368353178\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.7737940026075619,\n\
\ \"acc_stderr\": 0.010685470750077789,\n \"acc_norm\": 0.7737940026075619,\n\
\ \"acc_norm_stderr\": 0.010685470750077789\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.9227941176470589,\n \"acc_stderr\": 0.01621410416082776,\n\
\ \"acc_norm\": 0.9227941176470589,\n \"acc_norm_stderr\": 0.01621410416082776\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8676470588235294,\n \"acc_stderr\": 0.013709377734592326,\n \
\ \"acc_norm\": 0.8676470588235294,\n \"acc_norm_stderr\": 0.013709377734592326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.8090909090909091,\n\
\ \"acc_stderr\": 0.03764425585984927,\n \"acc_norm\": 0.8090909090909091,\n\
\ \"acc_norm_stderr\": 0.03764425585984927\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8693877551020408,\n \"acc_stderr\": 0.02157266469900928,\n\
\ \"acc_norm\": 0.8693877551020408,\n \"acc_norm_stderr\": 0.02157266469900928\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9253731343283582,\n\
\ \"acc_stderr\": 0.018581939698490618,\n \"acc_norm\": 0.9253731343283582,\n\
\ \"acc_norm_stderr\": 0.018581939698490618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759036,\n \
\ \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759036\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.6566265060240963,\n\
\ \"acc_stderr\": 0.03696584317010602,\n \"acc_norm\": 0.6566265060240963,\n\
\ \"acc_norm_stderr\": 0.03696584317010602\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.9298245614035088,\n \"acc_stderr\": 0.019591541754525123,\n\
\ \"acc_norm\": 0.9298245614035088,\n \"acc_norm_stderr\": 0.019591541754525123\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5419063822595955,\n\
\ \"mc2_stderr\": 0.015465200826091909\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090255\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5951478392721758,\n \
\ \"acc_stderr\": 0.013520817666870516\n }\n}\n```"
repo_url: https://huggingface.co/AA051615/L0225
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|arc:challenge|25_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|gsm8k|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hellaswag|10_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T05-21-12.964101.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T05-21-12.964101.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- '**/details_harness|winogrande|5_2024-03-01T05-21-12.964101.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T05-21-12.964101.parquet'
- config_name: results
data_files:
- split: 2024_03_01T05_21_12.964101
path:
- results_2024-03-01T05-21-12.964101.parquet
- split: latest
path:
- results_2024-03-01T05-21-12.964101.parquet
---
# Dataset Card for Evaluation run of AA051615/L0225
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051615/L0225](https://huggingface.co/AA051615/L0225) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051615__L0225",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-01T05:21:12.964101](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051615__L0225/blob/main/results_2024-03-01T05-21-12.964101.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.8192856560436336,
"acc_stderr": 0.02522551022169272,
"acc_norm": 0.8278237600056262,
"acc_norm_stderr": 0.025630597996564967,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5419063822595955,
"mc2_stderr": 0.015465200826091909
},
"harness|arc:challenge|25": {
"acc": 0.6356655290102389,
"acc_stderr": 0.014063260279882419,
"acc_norm": 0.681740614334471,
"acc_norm_stderr": 0.013611993916971451
},
"harness|hellaswag|10": {
"acc": 0.6269667396932882,
"acc_stderr": 0.004826224784850442,
"acc_norm": 0.8273252340171281,
"acc_norm_stderr": 0.003771934042799158
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7925925925925926,
"acc_stderr": 0.03502553170678318,
"acc_norm": 0.7925925925925926,
"acc_norm_stderr": 0.03502553170678318
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8881578947368421,
"acc_stderr": 0.025648341251693598,
"acc_norm": 0.8881578947368421,
"acc_norm_stderr": 0.025648341251693598
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8905660377358491,
"acc_stderr": 0.019213530010965436,
"acc_norm": 0.8905660377358491,
"acc_norm_stderr": 0.019213530010965436
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9583333333333334,
"acc_stderr": 0.016710315802959983,
"acc_norm": 0.9583333333333334,
"acc_norm_stderr": 0.016710315802959983
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.815028901734104,
"acc_stderr": 0.029605623981771197,
"acc_norm": 0.815028901734104,
"acc_norm_stderr": 0.029605623981771197
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352201,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352201
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8382978723404255,
"acc_stderr": 0.024068505289695338,
"acc_norm": 0.8382978723404255,
"acc_norm_stderr": 0.024068505289695338
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.043727482902780085,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.043727482902780085
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8620689655172413,
"acc_stderr": 0.028735632183908073,
"acc_norm": 0.8620689655172413,
"acc_norm_stderr": 0.028735632183908073
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7936507936507936,
"acc_stderr": 0.02084229093011468,
"acc_norm": 0.7936507936507936,
"acc_norm_stderr": 0.02084229093011468
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6190476190476191,
"acc_stderr": 0.04343525428949099,
"acc_norm": 0.6190476190476191,
"acc_norm_stderr": 0.04343525428949099
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9483870967741935,
"acc_stderr": 0.012586144774300194,
"acc_norm": 0.9483870967741935,
"acc_norm_stderr": 0.012586144774300194
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.7389162561576355,
"acc_stderr": 0.030903796952114468,
"acc_norm": 0.7389162561576355,
"acc_norm_stderr": 0.030903796952114468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.022448399923854286,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.022448399923854286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9545454545454546,
"acc_stderr": 0.014840681800540868,
"acc_norm": 0.9545454545454546,
"acc_norm_stderr": 0.014840681800540868
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9844559585492227,
"acc_stderr": 0.008927492715084352,
"acc_norm": 0.9844559585492227,
"acc_norm_stderr": 0.008927492715084352
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.882051282051282,
"acc_stderr": 0.016353801778303395,
"acc_norm": 0.882051282051282,
"acc_norm_stderr": 0.016353801778303395
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.030039842454069283,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.030039842454069283
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.9243697478991597,
"acc_stderr": 0.01717498881493851,
"acc_norm": 0.9243697478991597,
"acc_norm_stderr": 0.01717498881493851
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.6158940397350994,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.6158940397350994,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9486238532110092,
"acc_stderr": 0.009465168181022976,
"acc_norm": 0.9486238532110092,
"acc_norm_stderr": 0.009465168181022976
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.75,
"acc_stderr": 0.029531221160930918,
"acc_norm": 0.75,
"acc_norm_stderr": 0.029531221160930918
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.946078431372549,
"acc_stderr": 0.015852465281106922,
"acc_norm": 0.946078431372549,
"acc_norm_stderr": 0.015852465281106922
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9493670886075949,
"acc_stderr": 0.014271760025370174,
"acc_norm": 0.9493670886075949,
"acc_norm_stderr": 0.014271760025370174
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.852017937219731,
"acc_stderr": 0.023831557157613533,
"acc_norm": 0.852017937219731,
"acc_norm_stderr": 0.023831557157613533
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9504132231404959,
"acc_stderr": 0.019817485633523632,
"acc_norm": 0.9504132231404959,
"acc_norm_stderr": 0.019817485633523632
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665168,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665168
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.9386503067484663,
"acc_stderr": 0.01885387414579323,
"acc_norm": 0.9386503067484663,
"acc_norm_stderr": 0.01885387414579323
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6607142857142857,
"acc_stderr": 0.044939490686135404,
"acc_norm": 0.6607142857142857,
"acc_norm_stderr": 0.044939490686135404
},
"harness|hendrycksTest-management|5": {
"acc": 0.9320388349514563,
"acc_stderr": 0.02491995914251447,
"acc_norm": 0.9320388349514563,
"acc_norm_stderr": 0.02491995914251447
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9487179487179487,
"acc_stderr": 0.014450181176872742,
"acc_norm": 0.9487179487179487,
"acc_norm_stderr": 0.014450181176872742
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466143,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466143
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9399744572158365,
"acc_stderr": 0.008494204207108457,
"acc_norm": 0.9399744572158365,
"acc_norm_stderr": 0.008494204207108457
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.838150289017341,
"acc_stderr": 0.019829299214925416,
"acc_norm": 0.838150289017341,
"acc_norm_stderr": 0.019829299214925416
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8513966480446927,
"acc_stderr": 0.011896289146701147,
"acc_norm": 0.8513966480446927,
"acc_norm_stderr": 0.011896289146701147
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8921568627450981,
"acc_stderr": 0.0177609809027895,
"acc_norm": 0.8921568627450981,
"acc_norm_stderr": 0.0177609809027895
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8971061093247589,
"acc_stderr": 0.017255830051445344,
"acc_norm": 0.8971061093247589,
"acc_norm_stderr": 0.017255830051445344
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.018689725721062072,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.018689725721062072
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.7092198581560284,
"acc_stderr": 0.027090664368353178,
"acc_norm": 0.7092198581560284,
"acc_norm_stderr": 0.027090664368353178
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.7737940026075619,
"acc_stderr": 0.010685470750077789,
"acc_norm": 0.7737940026075619,
"acc_norm_stderr": 0.010685470750077789
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.9227941176470589,
"acc_stderr": 0.01621410416082776,
"acc_norm": 0.9227941176470589,
"acc_norm_stderr": 0.01621410416082776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8676470588235294,
"acc_stderr": 0.013709377734592326,
"acc_norm": 0.8676470588235294,
"acc_norm_stderr": 0.013709377734592326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.8090909090909091,
"acc_stderr": 0.03764425585984927,
"acc_norm": 0.8090909090909091,
"acc_norm_stderr": 0.03764425585984927
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8693877551020408,
"acc_stderr": 0.02157266469900928,
"acc_norm": 0.8693877551020408,
"acc_norm_stderr": 0.02157266469900928
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9253731343283582,
"acc_stderr": 0.018581939698490618,
"acc_norm": 0.9253731343283582,
"acc_norm_stderr": 0.018581939698490618
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759036,
"acc_norm": 0.95,
"acc_norm_stderr": 0.021904291355759036
},
"harness|hendrycksTest-virology|5": {
"acc": 0.6566265060240963,
"acc_stderr": 0.03696584317010602,
"acc_norm": 0.6566265060240963,
"acc_norm_stderr": 0.03696584317010602
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9298245614035088,
"acc_stderr": 0.019591541754525123,
"acc_norm": 0.9298245614035088,
"acc_norm_stderr": 0.019591541754525123
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5419063822595955,
"mc2_stderr": 0.015465200826091909
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090255
},
"harness|gsm8k|5": {
"acc": 0.5951478392721758,
"acc_stderr": 0.013520817666870516
}
}
```
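The `"all"` block at the top of the results aggregates the per-task scores. Assuming (as a simplification; the leaderboard's exact aggregation may differ) that the aggregate accuracy is an unweighted mean of the per-task accuracies, a minimal stdlib-only sketch of that aggregation using three of the values above:

```python
# Sketch: recompute an aggregate accuracy as the unweighted mean of
# per-task accuracies. The assumption that "all" is a plain mean is
# illustrative, not taken from the leaderboard source.
from statistics import mean

per_task_acc = {
    "harness|arc:challenge|25": 0.6356655290102389,
    "harness|hellaswag|10": 0.6269667396932882,
    "harness|hendrycksTest-abstract_algebra|5": 0.59,
}

aggregate = mean(per_task_acc.values())
print(f"mean acc over {len(per_task_acc)} tasks: {aggregate:.4f}")
```

Extending `per_task_acc` to all 60+ task entries in the JSON above would reproduce a figure close to the reported `"all"` accuracy.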
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
FreedomIntelligence/huatuo26M-testdatasets | ---
license: apache-2.0
task_categories:
- text-generation
language:
- zh
tags:
- medical
size_categories:
- 1K<n<10K
---
# Dataset Card for huatuo26M-testdatasets
## Dataset Description
- **Homepage:** https://www.huatuogpt.cn/
- **Repository:** https://github.com/FreedomIntelligence/Huatuo-26M
- **Paper:** https://arxiv.org/abs/2305.01526
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
We are pleased to announce the release of our evaluation dataset, a subset of the Huatuo-26M. This dataset contains 6,000 entries that we used for Natural Language Generation (NLG) experimentation in our associated research paper.
We encourage researchers and developers to use this evaluation dataset to gauge the performance of their own models. This is not only a chance to assess the accuracy and relevancy of generated responses but also an opportunity to investigate their model's proficiency in understanding and generating complex medical language.
Note: All the data points have been anonymized to protect patient privacy, and they adhere strictly to data protection and privacy regulations.
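Gauging "accuracy and relevancy of generated responses" against this evaluation set typically means comparing model outputs to reference answers with overlap-based NLG metrics. The toy unigram-overlap F1 below is only an illustrative, stdlib-only sketch of that idea, not the metric reported in the Huatuo-26M paper; note also that Chinese text would first need a proper tokenizer rather than whitespace `split()`.

```python
# Toy reference-overlap F1 between a generated answer and a reference.
# Illustrative only: real evaluations use standard metrics (e.g. BLEU,
# ROUGE) and, for Chinese, a proper word/character tokenizer.
from collections import Counter

def unigram_f1(generated: str, reference: str) -> float:
    gen = Counter(generated.split())
    ref = Counter(reference.split())
    overlap = sum((gen & ref).values())  # clipped per-token overlap
    if overlap == 0:
        return 0.0
    precision = overlap / sum(gen.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(unigram_f1("take aspirin twice daily", "take aspirin once daily"))
```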
## Citation
```
@misc{li2023huatuo26m,
title={Huatuo-26M, a Large-scale Chinese Medical QA Dataset},
author={Jianquan Li and Xidong Wang and Xiangbo Wu and Zhiyi Zhang and Xiaolong Xu and Jie Fu and Prayag Tiwari and Xiang Wan and Benyou Wang},
year={2023},
eprint={2305.01526},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
tomibastias/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
monsoonery/voxpopuli_nl_EVAL_pseudo_labelled | ---
dataset_info:
config_name: nl
features:
- name: audio_id
dtype: string
- name: language
dtype:
class_label:
names:
'0': en
'1': de
'2': fr
'3': es
'4': pl
'5': it
'6': ro
'7': hu
'8': cs
'9': nl
'10': fi
'11': hr
'12': sk
'13': sl
'14': et
'15': lt
'16': en_accented
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: raw_text
dtype: string
- name: normalized_text
dtype: string
- name: gender
dtype: string
- name: speaker_id
dtype: string
- name: is_gold_transcript
dtype: bool
- name: accent
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: validation
num_bytes: 638121672.64
num_examples: 1230
download_size: 509816155
dataset_size: 638121672.64
configs:
- config_name: nl
data_files:
- split: validation
path: nl/validation-*
---
|
Samsoup/cosmos_qa | ---
dataset_info:
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer0
dtype: string
- name: answer1
dtype: string
- name: answer2
dtype: string
- name: answer3
dtype: string
- name: label
dtype: int32
splits:
- name: train
num_bytes: 17156676
num_examples: 25262
- name: test
num_bytes: 5120580
num_examples: 6963
- name: validation
num_bytes: 2186585
num_examples: 2985
download_size: 12029581
dataset_size: 24463841
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
ShapeNet/shapenetcore-gltf | ---
language:
- en
pretty_name: ShapeNetCore
tags:
- 3D shapes
license: other
extra_gated_heading: Acknowledge license to accept the repository
extra_gated_prompt: >-
To request access to this ShapeNet repo, you will need to provide your **full name** (please provide both your first and last name), the name of your **advisor or the principal investigator (PI)** of your lab (in the PI/Advisor) fields, and the **school or company** that you are affiliated with (the **Affiliation** field).
After requesting access to this ShapeNet repo, you will be considered for access approval.
After access approval, you (the "Researcher") receive permission to use the ShapeNet database (the "Database") at Princeton University and Stanford University. In exchange for being able to join the ShapeNet community and receive such permission, Researcher hereby agrees to the following terms and conditions:
Researcher shall use the Database only for non-commercial research and educational purposes.
Princeton University and Stanford University make no representations or warranties regarding the Database, including but not limited to warranties of non-infringement or fitness for a particular purpose.
Researcher accepts full responsibility for his or her use of the Database and shall defend and indemnify Princeton University and Stanford University, including their employees, Trustees, officers and agents, against any and all claims arising from Researcher's use of the Database, including but not limited to Researcher's use of any copies of copyrighted 3D models that he or she may create from the Database.
Researcher may provide research associates and colleagues with access to the Database provided that they first agree to be bound by these terms and conditions.
Princeton University and Stanford University reserve the right to terminate Researcher's access to the Database at any time.
If Researcher is employed by a for-profit, commercial entity, Researcher's employer shall also be bound by these terms and conditions, and Researcher hereby represents that he or she is fully authorized to enter into this agreement on behalf of such employer.
The law of the State of New Jersey shall apply to all disputes under this agreement.
For access to the data, please fill in your **full name** (both first and last name), the name of your **advisor or principal investigator (PI)**, and the name of the **school or company** you are affiliated with.
Please actually fill out the fields (DO NOT put the word "Advisor" for PI/Advisor and the word "School" for "Affiliation", please specify the name of your advisor and the name of your school).
extra_gated_fields:
Name: text
PI/Advisor: text
Affiliation: text
Purpose: text
Country: text
I agree to use this dataset for non-commercial use ONLY: checkbox
---
This repository contains ShapeNetCore (v2) in [GLTF](https://en.wikipedia.org/wiki/GlTF) format, a subset of [ShapeNet](https://shapenet.org).
ShapeNetCore is a densely annotated subset of ShapeNet covering 55 common object categories with ~51,300 unique 3D models. Each model in ShapeNetCore is linked to an appropriate synset in [WordNet 3.0](https://wordnet.princeton.edu/).
If you use ShapeNet data, you agree to abide by the [ShapeNet terms of use](https://shapenet.org/terms). You are only allowed to redistribute the data to your research associates and colleagues provided that they first agree to be bound by these terms and conditions.
If you use this data, please cite the main ShapeNet technical report.
```
@techreport{shapenet2015,
title = {{ShapeNet: An Information-Rich 3D Model Repository}},
author = {Chang, Angel X. and Funkhouser, Thomas and Guibas, Leonidas and Hanrahan, Pat and Huang, Qixing and Li, Zimo and Savarese, Silvio and Savva, Manolis and Song, Shuran and Su, Hao and Xiao, Jianxiong and Yi, Li and Yu, Fisher},
number = {arXiv:1512.03012 [cs.GR]},
institution = {Stanford University --- Princeton University --- Toyota Technological Institute at Chicago},
year = {2015}
}
```
For more information, please contact us at shapenetwebmaster@gmail.com and indicate ShapeNetCore v2 in the title of your email.
|
robson2286/VozJoseCarlos | ---
license: openrail
---
|
tyzhu/lmind_nq_train600_eval300_v1_qa | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train_qa
num_bytes: 68720
num_examples: 600
- name: train_recite_qa
num_bytes: 453011
num_examples: 600
- name: eval_qa
num_bytes: 35277
num_examples: 300
- name: eval_recite_qa
num_bytes: 226920
num_examples: 300
- name: all_docs
num_bytes: 574063
num_examples: 883
- name: all_docs_eval
num_bytes: 573998
num_examples: 883
- name: train
num_bytes: 68720
num_examples: 600
- name: validation
num_bytes: 35277
num_examples: 300
download_size: 1292475
dataset_size: 2035986
---
# Dataset Card for "lmind_nq_train600_eval300_v1_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dyumat/databricks-dolly-5k-rag-split | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 3784361.705549264
num_examples: 4658
- name: test
num_bytes: 210422.8599693558
num_examples: 259
- name: validation
num_bytes: 210422.8599693558
num_examples: 259
download_size: 4924915
dataset_size: 4205207.425487976
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
CyberHarem/houjuu_nue_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of houjuu_nue/封獣ぬえ/호쥬누에 (Touhou)
This is the dataset of houjuu_nue/封獣ぬえ/호쥬누에 (Touhou), containing 500 images and their tags.
The core tags of this character are `black_hair, wings, asymmetrical_wings, red_eyes, short_hair, bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 608.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houjuu_nue_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 394.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houjuu_nue_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1136 | 741.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houjuu_nue_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 561.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houjuu_nue_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1136 | 961.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houjuu_nue_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/houjuu_nue_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, black_thighhighs, dress, solo, zettai_ryouiki, smile, snake, trident |
| 1 | 7 |  |  |  |  |  | 1girl, black_thighhighs, dress, snake, solo, trident, zettai_ryouiki |
| 2 | 11 |  |  |  |  |  | 1girl, black_thighhighs, dress, solo, zettai_ryouiki, smile, ahoge |
| 3 | 8 |  |  |  |  |  | 1girl, black_dress, black_thighhighs, blue_wings, looking_at_viewer, red_bowtie, red_wings, short_dress, short_sleeves, smile, solo, trident, bangs, center_frills, hair_between_eyes, holding_weapon, simple_background, zettai_ryouiki, ahoge, blush, medium_breasts, white_background, cowboy_shot, pointy_ears, snake, wristband, buttons, closed_mouth, open_mouth, thighs |
| 4 | 9 |  |  |  |  |  | 1girl, black_dress, black_thighhighs, solo, zettai_ryouiki, looking_at_viewer, smile, short_sleeves |
| 5 | 8 |  |  |  |  |  | 1girl, bangs, black_dress, black_thighhighs, center_frills, red_bowtie, red_wings, short_dress, short_sleeves, snake, solo, trident, blue_wings, blush, footwear_bow, full_body, holding_weapon, looking_at_viewer, red_footwear, shoes, closed_mouth, wristband, :d, open_mouth, simple_background, ufo |
| 6 | 9 |  |  |  |  |  | 1girl, black_dress, solo, looking_at_viewer, red_bowtie, simple_background, upper_body, short_sleeves, white_background |
| 7 | 5 |  |  |  |  |  | 1girl, black_thighhighs, pantyshot, solo, blush, white_panties, snake, black_dress, ufo |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_thighhighs | dress | solo | zettai_ryouiki | smile | snake | trident | ahoge | black_dress | blue_wings | looking_at_viewer | red_bowtie | red_wings | short_dress | short_sleeves | bangs | center_frills | hair_between_eyes | holding_weapon | simple_background | blush | medium_breasts | white_background | cowboy_shot | pointy_ears | wristband | buttons | closed_mouth | open_mouth | thighs | footwear_bow | full_body | red_footwear | shoes | :d | ufo | upper_body | pantyshot | white_panties |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:--------|:-------|:-----------------|:--------|:--------|:----------|:--------|:--------------|:-------------|:--------------------|:-------------|:------------|:--------------|:----------------|:--------|:----------------|:--------------------|:-----------------|:--------------------|:--------|:-----------------|:-------------------|:--------------|:--------------|:------------|:----------|:---------------|:-------------|:---------|:---------------|:------------|:---------------|:--------|:-----|:------|:-------------|:------------|:----------------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | X | X | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | | X | X | X | | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | | X | | | X | X | | X | X | X | X | X | X | X | X | X | | X | X | X | | | | | X | | X | X | | X | X | X | X | X | X | | | |
| 6 | 9 |  |  |  |  |  | X | | | X | | | | | | X | | X | X | | | X | | | | | X | | | X | | | | | | | | | | | | | | X | | |
| 7 | 5 |  |  |  |  |  | X | X | | X | | | X | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | X | X |
|
torchgeo/tropical_cyclone | ---
license: cc-by-4.0
---
|
Falah/story44kids_2_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 3409
num_examples: 10
download_size: 4787
dataset_size: 3409
---
# Dataset Card for "story44kids_2_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MoonIcee/joaoo | ---
license: openrail
---
|
CyberHarem/gepard_m1_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of gepard_m1/ゲパードM1/猎豹M1 (Girls' Frontline)
This is the dataset of gepard_m1/ゲパードM1/猎豹M1 (Girls' Frontline), containing 22 images and their tags.
The core tags of this character are `long_hair, bangs, grey_hair, hair_between_eyes, breasts, yellow_eyes, white_hair, hair_ornament, hairclip, beret, hat, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 22 | 24.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gepard_m1_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 22 | 14.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gepard_m1_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 42 | 27.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gepard_m1_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 22 | 22.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gepard_m1_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 42 | 38.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gepard_m1_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gepard_m1_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, closed_mouth, solo, black_gloves, looking_at_viewer, black_headwear, blush, jacket, military_uniform, simple_background, skirt, white_background, black_thighhighs, boots, brown_eyes, character_name, holding_gun, long_sleeves, neck_ribbon, sitting, sniper_rifle |
| 1 | 7 |  |  |  |  |  | 1girl, solo, closed_mouth, looking_at_viewer, blush, cat, fur-trimmed_jacket, green_jacket, messy_hair, full_body, long_sleeves, pantyhose, twin_braids, alternate_hairstyle, animal, black_skirt, brown_eyes, hair_over_shoulder, holding, no_shoes, off_shoulder, official_alternate_costume, on_side, scarf, sidelocks, turtleneck_sweater, white_shirt |
| 2 | 5 |  |  |  |  |  | 1girl, cleavage, collarbone, looking_at_viewer, solo, bare_shoulders, simple_background, upper_body, parted_lips, white_background, blush, open_clothes, strapless, white_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | solo | black_gloves | looking_at_viewer | black_headwear | blush | jacket | military_uniform | simple_background | skirt | white_background | black_thighhighs | boots | brown_eyes | character_name | holding_gun | long_sleeves | neck_ribbon | sitting | sniper_rifle | cat | fur-trimmed_jacket | green_jacket | messy_hair | full_body | pantyhose | twin_braids | alternate_hairstyle | animal | black_skirt | hair_over_shoulder | holding | no_shoes | off_shoulder | official_alternate_costume | on_side | scarf | sidelocks | turtleneck_sweater | white_shirt | cleavage | collarbone | bare_shoulders | upper_body | parted_lips | open_clothes | strapless | white_dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:---------------|:--------------------|:-----------------|:--------|:---------|:-------------------|:--------------------|:--------|:-------------------|:-------------------|:--------|:-------------|:-----------------|:--------------|:---------------|:--------------|:----------|:---------------|:------|:---------------------|:---------------|:-------------|:------------|:------------|:--------------|:----------------------|:---------|:--------------|:---------------------|:----------|:-----------|:---------------|:-----------------------------|:----------|:--------|:------------|:---------------------|:--------------|:-----------|:-------------|:-----------------|:-------------|:--------------|:---------------|:------------|:--------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | X | | X | | | | | | | | X | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | | X | | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
TigerHatKth/metal | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2168819.0
num_examples: 10
download_size: 2170442
dataset_size: 2168819.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
openaccess-ai-collective/28e7808a553e017d2ac590c071596341 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 7885403186
num_examples: 355253
- name: test
num_bytes: 2323013808
num_examples: 104769
download_size: 1868121538
dataset_size: 10208416994
---
# Dataset Card for "28e7808a553e017d2ac590c071596341"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_10000_Diabetes130US_sgosdt_l256_dim7_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 205720000
num_examples: 10000
- name: validation
num_bytes: 205720000
num_examples: 10000
download_size: 46817857
dataset_size: 411440000
---
# Dataset Card for "autotree_automl_10000_Diabetes130US_sgosdt_l256_dim7_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MetalMace/MTG-CardArt | ---
license: mit
---
|
ASdsadasda123/SuperDSDSDSDSD | ---
license: apache-2.0
---
|
benayas/massive_chatgpt_10pct_v1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 761162
num_examples: 11514
download_size: 269754
dataset_size: 761162
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cassandraqs/TripReview | ---
dataset_info:
features:
- name: dest
dtype: string
- name: review
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 11266
num_examples: 5
download_size: 21580
dataset_size: 11266
---
# Dataset Card for "TripReview"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anan-2024/twitter_dataset_1712973406 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 110580
num_examples: 289
download_size: 56951
dataset_size: 110580
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
statsmind/llama-factory | ---
license: apache-2.0
---
|
turkish_ner | ---
annotations_creators:
- machine-generated
language_creators:
- expert-generated
language:
- tr
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: TurkishNer
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: domain
dtype:
class_label:
names:
'0': architecture
'1': basketball
'2': book
'3': business
'4': education
'5': fictional_universe
'6': film
'7': food
'8': geography
'9': government
'10': law
'11': location
'12': military
'13': music
'14': opera
'15': organization
'16': people
'17': religion
'18': royalty
'19': soccer
'20': sports
'21': theater
'22': time
'23': travel
'24': tv
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PERSON
'2': I-PERSON
'3': B-ORGANIZATION
'4': I-ORGANIZATION
'5': B-LOCATION
'6': I-LOCATION
'7': B-MISC
'8': I-MISC
splits:
- name: train
num_bytes: 177658278
num_examples: 532629
download_size: 204393976
dataset_size: 177658278
---
# Dataset Card for turkish_ner
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://arxiv.org/abs/1702.02363
- **Repository:** [Needs More Information]
- **Paper:** http://arxiv.org/abs/1702.02363
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** erayyildiz@ktu.edu.tr
### Dataset Summary
Automatically annotated Turkish corpus for named entity recognition and text categorization using large-scale gazetteers. The constructed gazetteers contain approximately 300K entities with thousands of fine-grained entity types under 25 different domains.
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
Turkish
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
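While the card leaves this section open, the metadata above defines `tokens` as a string sequence and `ner_tags` as BIO-style label ids. A minimal sketch (not from the original card) of decoding a tag-id sequence into entity spans:

```python
# Hypothetical helper: decode BIO-style ner_tags (as listed in the
# metadata above) into (entity_type, start, end) token spans.
TAG_NAMES = [
    "O",
    "B-PERSON", "I-PERSON",
    "B-ORGANIZATION", "I-ORGANIZATION",
    "B-LOCATION", "I-LOCATION",
    "B-MISC", "I-MISC",
]

def decode_bio(tag_ids):
    """Convert a sequence of tag ids into entity spans (type, start, end)."""
    spans, current = [], None
    for i, tid in enumerate(tag_ids):
        tag = TAG_NAMES[tid]
        if tag.startswith("B-"):
            # A B- tag always opens a new entity, closing any open one.
            if current:
                spans.append(current)
            current = (tag[2:], i, i + 1)
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            # Extend the open entity by one token.
            current = (current[0], current[1], i + 1)
        else:
            # O tag (or a dangling I- tag) closes any open entity.
            if current:
                spans.append(current)
            current = None
    if current:
        spans.append(current)
    return spans

# e.g. B-PERSON I-PERSON O B-LOCATION O
print(decode_bio([1, 2, 0, 5, 0]))  # [('PERSON', 0, 2), ('LOCATION', 3, 4)]
```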
### Data Splits
There's only the training set.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
H. Bahadir Sahin, Caglar Tirkaz, Eray Yildiz, Mustafa Tolga Eren and Omer Ozan Sonmez
### Licensing Information
Creative Commons Attribution 4.0 International
### Citation Information
```
@article{DBLP:journals/corr/SahinTYES17,
  author        = {H. Bahadir Sahin and
                   Caglar Tirkaz and
                   Eray Yildiz and
                   Mustafa Tolga Eren and
                   Omer Ozan Sonmez},
  title         = {Automatically Annotated Turkish Corpus for Named Entity Recognition
                   and Text Categorization using Large-Scale Gazetteers},
  journal       = {CoRR},
  volume        = {abs/1702.02363},
  year          = {2017},
  url           = {http://arxiv.org/abs/1702.02363},
  archivePrefix = {arXiv},
  eprint        = {1702.02363},
  timestamp     = {Mon, 13 Aug 2018 16:46:36 +0200},
  biburl        = {https://dblp.org/rec/journals/corr/SahinTYES17.bib},
  bibsource     = {dblp computer science bibliography, https://dblp.org}
}
```
### Contributions
Thanks to [@merveenoyan](https://github.com/merveenoyan) for adding this dataset. |
zpn/pcba_686978 | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
license:
- mit
multilinguality:
- monolingual
pretty_name: pcba_686978
size_categories:
- 100K<n<1M
source_datasets: []
tags:
- bio
- bio-chem
- molnet
- molecule-net
- biophysics
task_categories:
- other
task_ids: []
---
# Dataset Card for pcba_686978
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://moleculenet.org/
- **Repository:** https://github.com/deepchem/deepchem/tree/master
- **Paper:** https://arxiv.org/abs/1703.00564
### Dataset Summary
`pcba_686978` is a dataset included in [MoleculeNet](https://moleculenet.org/). PubChem BioAssay (PCBA) is a database consisting of biological activities of small molecules generated by high-throughput screening. We have chosen one of the larger tasks (ID 686978) as described in https://par.nsf.gov/servlets/purl/10168888.
## Dataset Structure
### Data Fields
Each split contains
* `smiles`: the [SMILES](https://en.wikipedia.org/wiki/Simplified_molecular-input_line-entry_system) representation of a molecule
* `selfies`: the [SELFIES](https://github.com/aspuru-guzik-group/selfies) representation of a molecule
* `target`: Measured results (Active/Inactive) for bioassays
### Data Splits
The dataset is split 80/10/10 into train/valid/test sets using a random split.
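The 80/10/10 random split described above can be sketched with the standard library alone (a hypothetical illustration; the actual split was produced by the dataset authors, so exact membership will differ):

```python
import random

def random_split(items, fractions=(0.8, 0.1, 0.1), seed=0):
    """Shuffle and slice a list into train/valid/test by the given fractions."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    shuffled = list(items)
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train = int(fractions[0] * n)
    n_valid = int(fractions[1] * n)
    return (
        shuffled[:n_train],
        shuffled[n_train:n_train + n_valid],
        shuffled[n_train + n_valid:],
    )

train, valid, test = random_split(range(1000))
print(len(train), len(valid), len(test))  # 800 100 100
```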
### Source Data
#### Initial Data Collection and Normalization
Data was originally generated by the Pande Group at Stanford.
### Licensing Information
This dataset was originally released under an MIT license
### Citation Information
```
@misc{https://doi.org/10.48550/arxiv.1703.00564,
doi = {10.48550/ARXIV.1703.00564},
url = {https://arxiv.org/abs/1703.00564},
author = {Wu, Zhenqin and Ramsundar, Bharath and Feinberg, Evan N. and Gomes, Joseph and Geniesse, Caleb and Pappu, Aneesh S. and Leswing, Karl and Pande, Vijay},
keywords = {Machine Learning (cs.LG), Chemical Physics (physics.chem-ph), Machine Learning (stat.ML), FOS: Computer and information sciences, FOS: Computer and information sciences, FOS: Physical sciences, FOS: Physical sciences},
title = {MoleculeNet: A Benchmark for Molecular Machine Learning},
publisher = {arXiv},
year = {2017},
copyright = {arXiv.org perpetual, non-exclusive license}
}
```
### Contributions
Thanks to [@zanussbaum](https://github.com/zanussbaum) for adding this dataset.
|
alikli/code | ---
license: apache-2.0
---
|
inoid/SpanishMedicaLLM | ---
license: cc-by-2.0
---
|
jlbaker361/flickr_humans_dim_128_20k_vangogh | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 693719405.0
num_examples: 20000
download_size: 693447027
dataset_size: 693719405.0
---
# Dataset Card for "flickr_humans_dim_128_20k_vangogh"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_236 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1220045756.0
num_examples: 237733
download_size: 1247468443
dataset_size: 1220045756.0
---
# Dataset Card for "chunk_236"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
oliverjthomas2000/test | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 111
num_examples: 5
download_size: 1213
dataset_size: 111
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pmc/open_access | ---
annotations_creators:
- no-annotation
language_creators:
- expert-generated
language:
- en
license:
- cc0-1.0
- cc-by-4.0
- cc-by-sa-4.0
- cc-by-nd-4.0
- cc-by-nc-4.0
- cc-by-nc-sa-4.0
- cc-by-nc-nd-4.0
- other
- unknown
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- language-modeling
pretty_name: PMC Open Access
---
# Dataset Card for PMC Open Access Subset
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.ncbi.nlm.nih.gov/pmc/tools/openftlist/
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** [PubMed Central](mailto:pubmedcentral@ncbi.nlm.nih.gov)
### Dataset Summary
The PMC Open Access Subset includes more than 3.4 million journal articles and preprints that are made available under
license terms that allow reuse. Not all articles in PMC are available for text mining and other reuse: many are under
copyright protection. However, articles in the PMC Open Access Subset are made available under Creative Commons or
similar licenses that generally allow more liberal redistribution and reuse than a traditional copyrighted work. The
PMC Open Access Subset is one part of the PMC Article Datasets.
Within the PMC Open Access Subset, there are three groupings:
- Commercial Use Allowed - CC0, CC BY, CC BY-SA, CC BY-ND licenses;
- Non-Commercial Use Only - CC BY-NC, CC BY-NC-SA, CC BY-NC-ND licenses; and
- Other - no machine-readable Creative Commons license, no license, or a custom license.
### Supported Tasks and Leaderboards
- Language modeling
### Languages
English (`en`).
## Dataset Structure
### Data Instances
```
{
'text': "==== Front\nPLoS BiolPLoS BiolpbioplosbiolPLoS Biology1544-91731545-7885Public Library of Science San Francisco, USA 10.1371/journal.pbio.0000005Research ArticleGenetics/Genomics/Gene TherapyInfectious DiseasesMicrobiologyPlasmodiumThe Transcriptome of the Intraerythrocytic Developmental Cycle of Plasmodium falciparum\n P. falciparum IDC TranscriptomeBozdech Zbynek \n1\nLlinás Manuel \n1\nPulliam Brian Lee \n1\nWong Edith D \n1\nZhu Jingchun \n2\nDeRisi Joseph L joe@derisilab.ucsf.edu\n1\n1Department of Biochemistry and Biophysics, University of California, San FranciscoSan Francisco, CaliforniaUnited States of America2Department of Biological and Medical Informatics, University of California, San FranciscoSan Francisco, CaliforniaUnited States of America10 2003 18 8 2003 18 8 2003 1 1 e512 6 2003 25 7 2003 Copyright: ©2003 Bozdech et al.2003This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.\nMicroarray Analysis: Genome-Scale Hypothesis Scanning \n\nMonitoring Malaria: Genomic Activity of the Parasite in Human Blood Cells \n\nPlasmodium falciparum is the causative agent of the most burdensome form of human malaria, affecting 200–300 million individuals per year worldwide. The recently sequenced genome of P. falciparum revealed over 5,400 genes, of which 60% encode proteins of unknown function. Insights into the biochemical function and regulation of these genes will provide the foundation for future drug and vaccine development efforts toward eradication of this disease. By analyzing the complete asexual intraerythrocytic developmental cycle (IDC) transcriptome of the HB3 strain of P. falciparum, we demonstrate that at least 60% of the genome is transcriptionally active during this stage. 
Our data demonstrate that this parasite has evolved an extremely specialized mode of transcriptional regulation that produces a continuous cascade of gene expression, beginning with genes corresponding to general cellular processes, such as protein synthesis, and ending with Plasmodium-specific functionalities, such as genes involved in erythrocyte invasion. The data reveal that genes contiguous along the chromosomes are rarely coregulated, while transcription from the plastid genome is highly coregulated and likely polycistronic. Comparative genomic hybridization between HB3 and the reference genome strain (3D7) was used to distinguish between genes not expressed during the IDC and genes not detected because of possible sequence variations...",
'pmid': '12929205',
'accession_id': 'PMC176545',
'license': 'CC BY',
'last_updated': '2021-01-05 08:21:03',
'retracted': 'no',
'citation': 'PLoS Biol. 2003 Oct 18; 1(1):e5'
}
```
### Data Fields
- `text`: Text content.
- `pmid`: PubMed ID.
- `accession_id`: PMC accession identifier (PMCID) of the article.
- `license`: License type.
- `last_updated`: Date of last update.
- `retracted`: Whether retracted or not.
- `citation`: Citation reference.
### Data Splits
The dataset is not split.
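Since the dataset ships as a single unsplit collection, a common first step is to partition records by their license grouping. A minimal sketch follows; the exact strings stored in the `license` field are an assumption based on the groupings listed above, so verify them against the actual data:

```python
# Map individual license strings onto the three PMC Open Access groupings
# described in this card. The license strings below are assumptions.
COMMERCIAL = {"CC0", "CC BY", "CC BY-SA", "CC BY-ND"}
NON_COMMERCIAL = {"CC BY-NC", "CC BY-NC-SA", "CC BY-NC-ND"}

def license_grouping(license_str):
    """Return the PMC Open Access grouping for a record's license."""
    if license_str in COMMERCIAL:
        return "commercial"
    if license_str in NON_COMMERCIAL:
        return "non-commercial"
    return "other"

record = {"accession_id": "PMC176545", "license": "CC BY", "retracted": "no"}
print(license_grouping(record["license"]))  # commercial
```

Records in the "other" grouping carry a custom license or no machine-readable license, so they warrant a manual license check before reuse.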
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
License terms vary. Please refer to the license statement in each article for specific terms of use.
Within the PMC Open Access Subset, there are three groupings based on available license terms:
- Commercial Use Allowed - CC0, CC BY, CC BY-SA, CC BY-ND licenses;
- Non-Commercial Use Only - CC BY-NC, CC BY-NC-SA, CC BY-NC-ND licenses; and
- Other - no machine-readable Creative Commons license, no license, or a custom license.
### Citation Information
```
PMC Open Access Subset [Internet]. Bethesda (MD): National Library of Medicine. 2003 - [cited YEAR MONTH DAY]. Available from https://www.ncbi.nlm.nih.gov/pmc/tools/openftlist/
```
### Contributions
Thanks to [@albertvillanova](https://github.com/albertvillanova) for adding this dataset.
|
artmobile/test | ---
license: mit
---
|
weitianwen/cmath | ---
license: cc-by-4.0
language:
- zh
tags:
- mathematics
size_categories:
- 1K<n<10K
---
# CMATH
## Introduction
We present the Chinese Elementary School Math Word Problems (CMATH) dataset, comprising 1.7k elementary school-level math word problems with detailed annotations, sourced from actual Chinese workbooks and exams. This dataset aims to provide a benchmark tool for assessing the following question: to what grade level of elementary school math do the abilities of popular large language models (LLMs) correspond? We evaluate a variety of popular LLMs, including both commercial and open-source options, and discover that only GPT-4 achieves success (accuracy >= 60%) across all six elementary school grades, while other models falter at different grade levels.
Furthermore, we assess the robustness of LLMs by augmenting the original problems in the CMATH dataset with distracting information. Our findings reveal that GPT-4 is the sole model that maintains robustness, further distinguishing its performance from competing models. We anticipate that our CMATH dataset will expose limitations in LLMs' capabilities and promote their ongoing development and advancement.
## Datasets
### cmath_dev
Initial release of 600 examples from CMATH dataset, with 100 problems from each elementary school grade.
We will release the remaining portion of the dataset by the end of the year.
#### Examples and Annotations

#### Evaluation Results

### distractor
To assess the robustness of LLMs against "irrelevant" information, we manually created a small "distractor dataset" comprising 60 examples, 10 for each grade level. Each example consists of an original problem and five associated variants augmented with one to five pieces of irrelevant information, respectively, which we refer to as distractors.
#### Examples

#### Evaluation Results

## Script
We provide a script `eval.py` that implements automated evaluation.
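The internals of `eval.py` are not described in this card. As a rough sketch of how automated evaluation of free-form answers to math word problems is commonly done, one can extract the final number from the model output and compare it to the annotated answer (the answer format assumed here is hypothetical, not the script's actual protocol):

```python
import re

def extract_final_number(text):
    """Pull the last number appearing in a free-form model answer."""
    matches = re.findall(r"-?\d+(?:\.\d+)?", text)
    return float(matches[-1]) if matches else None

def is_correct(model_output, ground_truth):
    """Compare the extracted number against the annotated answer."""
    predicted = extract_final_number(model_output)
    return predicted is not None and abs(predicted - float(ground_truth)) < 1e-6

print(is_correct("答案是 42。", "42"))  # True
```

Taking the last number in the output is a heuristic: chain-of-thought answers typically state the final result at the end, but it can misfire when a model appends units or follow-up remarks containing digits.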
## License
CC BY 4.0
## Citation
```
@misc{wei2023cmath,
title={CMATH: Can Your Language Model Pass Chinese Elementary School Math Test?},
author={Tianwen Wei and Jian Luan and Wei Liu and Shuang Dong and Bin Wang},
year={2023},
eprint={2306.16636},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
Visit our git [repository](https://github.com/XiaoMi/cmath) for more details.
You may also read our [paper](https://arxiv.org/abs/2306.16636). |
p208p2002/csl-1.8G | ---
configs:
- config_name: default
data_files:
- split: train
path: csl.jsonl
language:
- zh
---
# CSL: Chinese Scientific Paper Abstracts Dataset
Data source: https://github.com/ydli-ai/CSL |
Divya1287/llama2 | ---
license: openrail
task_categories:
- text-generation
- conversational
- question-answering
language:
- en
pretty_name: prompt
size_categories:
- 1K<n<10K
--- |