| datasetId | card |
|---|---|
gart-labor/eclassCorpus | ---
dataset_info:
features:
- name: did
dtype: int64
- name: query
dtype: string
- name: name
dtype: string
- name: datatype
dtype: string
- name: unit
dtype: string
- name: IRDI
dtype: string
- name: metalabel
dtype: int64
splits:
- name: train
num_bytes: 137123
num_examples: 672
download_size: 0
dataset_size: 137123
task_categories:
- sentence-similarity
language:
- en
size_categories:
- n<1K
---
# Dataset Card for "eclassCorpus"
This dataset consists of names and descriptions of pump properties from the ECLASS standard. It can be used to evaluate models on matching paraphrases to these ECLASS pump properties based on their semantics.
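As a rough sketch of such an evaluation (the embedding model named below is an assumption, not part of this dataset card), the `query` paraphrases can be matched against the property `name` column with a sentence-similarity model:
```python
# Minimal sketch: rank ECLASS property names against paraphrased queries.
# The embedding model is an assumption; any sentence-similarity model works.
from datasets import load_dataset
from sentence_transformers import SentenceTransformer, util

ds = load_dataset("gart-labor/eclassCorpus", split="train")
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

query_emb = model.encode(ds["query"], convert_to_tensor=True)
name_emb = model.encode(ds["name"], convert_to_tensor=True)

# Cosine-similarity matrix: row i holds the scores of query i against all names.
scores = util.cos_sim(query_emb, name_emb)
best_match = scores.argmax(dim=1)
```
|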
text-machine-lab/constrained_language | ---
dataset_info:
features:
- name: TEXT
dtype: string
splits:
- name: train
num_bytes: 4537675604
num_examples: 9081490
- name: validation
num_bytes: 50107745
num_examples: 100000
- name: test
num_bytes: 50134861
num_examples: 100000
download_size: 3052451421
dataset_size: 4637918210
---
# Dataset Card for constrained_language (pre-training data for simplified English)
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Citation Information](#citation-information)
## Dataset Description
- **Paper: https://arxiv.org/abs/2305.17266**
- **Point of Contact: vijeta_deshpande@student.uml.edu**
### Dataset Summary
This dataset is one of the two datasets published with the paper "Honey, I Shrunk the Language: Language Model Behavior at Reduced Scale" (https://arxiv.org/abs/2305.17266).
This repository contains the pre-training data constrained by vocabulary. The other published dataset, i.e., the pre-training data that is not constrained by vocabulary, is available at https://huggingface.co/datasets/text-machine-lab/unconstrained_language.
The vocabulary used for curating the data is constructed from the AOChildes corpus (https://www.sciencedirect.com/science/article/abs/pii/S0079742121000256), which consists of transcripts of child-directed speech. The vocabulary therefore consists of words spoken or heard by children aged six years or younger.
The vocabulary is then used to filter the following widely used text corpora:
- C4: https://arxiv.org/abs/1910.10683,
- BookCorpus: https://ieeexplore.ieee.org/document/7410368,
- Wikipedia: https://huggingface.co/datasets/wikipedia,
- Simplified-Wikipedia: https://simple.wikipedia.org/wiki/Main_Page,
- Children's Book Test Corpus: https://arxiv.org/abs/1511.02301
From the above corpora, only spans whose words all belong to the predefined vocabulary are included. The dataset includes 44 million sentences (~6 million sequences, each with ~128 tokens) and 3 million contiguous spans (each with ~128 tokens). Refer to Table 1 of the paper for the data distribution over the different corpora.
### Languages
The dataset contains the English language only.
## Dataset Structure
The dataset is available in the Arrow format with three splits: train, validation, and test. Every data instance has a single key, "TEXT", containing a text span of approximately 128 tokens.
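A minimal loading sketch, following the split names and the "TEXT" key described above:
```python
from datasets import load_dataset

# Loads the train, validation, and test splits described above.
ds = load_dataset("text-machine-lab/constrained_language")
print(ds)                            # split names and sizes
print(ds["validation"][0]["TEXT"])   # one text span of ~128 tokens
```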
### Citation Information
If this dataset is useful to you, please cite our work.
```bibtex
@article{deshpande2023honey,
title={Honey, I Shrunk the Language: Language Model Behavior at Reduced Scale},
author={Deshpande, Vijeta and Pechi, Dan and Thatte, Shree and Lialin, Vladislav and Rumshisky, Anna},
journal={arXiv preprint arXiv:2305.17266},
year={2023}
}
```
|
james-burton/OrientalMuseum_min5-mat | ---
dataset_info:
features:
- name: obj_num
dtype: string
- name: file
dtype: string
- name: image
dtype: image
- name: root
dtype: string
- name: description
dtype: string
- name: object_name
dtype: string
- name: other_name
dtype: string
- name: label
dtype:
class_label:
names:
'0': Animal Mummy
'1': Batik
'2': Buffalo Horn
'3': Chinese Red Rosewood
'4': Colour on Paper
'5': Flint/Chert
'6': Gouache on Paper
'7': Haematite/Red Ochre
'8': Human Bone
'9': Ink and Colour on Paper
'10': Ink and Colours on Silk
'11': Ink and Opaque Watercolour on Paper
'12': Ink on Paper
'13': Jade (Calcified)
'14': Japanese paper
'15': Microcline/Green Feldspar/Amazon-Stone
'16': Nile Mud
'17': Opaque Watercolour on Paper
'18': Opaque Watercolour or Gouache on Mica
'19': Pith
'20': Pith Paper
'21': Plant Product
'22': Resin/Plastic
'23': Rhinoceros Horn
'24': Smaragdite
'25': Steatite
'26': Steatite/Soap Stone
'27': Watercolour on Rice Paper
'28': acrylic
'29': agate
'30': alabaster
'31': aluminum
'32': amber
'33': amethyst
'34': antler
'35': artificial stone
'36': balsa
'37': bamboo
'38': basalt
'39': bone
'40': bowenite
'41': boxwood
'42': brass
'43': brocade
'44': bronze
'45': burnt jade
'46': canvas
'47': cardboard
'48': cards
'49': carnelian
'50': cast iron
'51': celadon
'52': cellulose acetate
'53': ceramic
'54': chalcedony
'55': cherry
'56': clay
'57': cloth
'58': coconut
'59': copper
'60': copper alloy
'61': coral
'62': cotton
'63': crystal
'64': diorite
'65': dolerite
'66': earthenware
'67': ebony
'68': emerald
'69': enamel
'70': faience
'71': felt
'72': flax
'73': flint
'74': gauze
'75': glass
'76': gold
'77': granite
'78': gray ware
'79': hardwood
'80': horn
'81': incense
'82': ink
'83': iron
'84': ivory
'85': jade
'86': jadeite
'87': jasper
'88': lacquer
'89': lapis lazuli
'90': lazurite
'91': lead
'92': lead alloy
'93': leather
'94': limestone
'95': linen
'96': malachite
'97': marble
'98': metal
'99': mineral
'100': mother of pearl
'101': muslin
'102': nephrite
'103': nylon
'104': obsidian
'105': organic material
'106': paint
'107': palm fiber
'108': palm leaf
'109': paper
'110': papier mâché
'111': papyrus
'112': pewter
'113': photographic paper
'114': pine
'115': plant fiber
'116': plaster
'117': plastic
'118': plate
'119': polyester
'120': polystyrene
'121': porcelain
'122': pottery
'123': quartzite
'124': rattan
'125': realgar
'126': reed
'127': rice paper
'128': rock
'129': rush
'130': sandstone
'131': satin
'132': schist
'133': seashell
'134': serpentine
'135': shell
'136': silk
'137': siltstone
'138': silver
'139': skull
'140': slate
'141': soapstone
'142': softwood
'143': stalagmites
'144': steel
'145': stone
'146': stoneware
'147': straw
'148': stucco
'149': sycamore
'150': synthetic fiber
'151': teak
'152': terracotta
'153': textiles
'154': tin
'155': tortoise shell
'156': tourmaline
'157': travertine
'158': tremolite
'159': turquoise
'160': velvet
'161': wood
'162': wool
'163': wrought iron
'164': zinc alloy
- name: production.period
dtype: string
- name: production.place
dtype: string
splits:
- name: train
num_bytes: 3150369309.880859
num_examples: 23060
- name: validation
num_bytes: 685257063.8715706
num_examples: 5426
- name: test
num_bytes: 535025459.36357063
num_examples: 5426
download_size: 3911528513
dataset_size: 4370651833.116
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
distilabel-internal-testing/ohp-writing | ---
dataset_info:
features:
- name: source
dtype: string
- name: category
dtype: string
- name: prompt
dtype: string
- name: candidates_completions
sequence: string
- name: candidate_policies
sequence: string
- name: ranks
sequence: int64
- name: rank_str
dtype: string
- name: chosen_policy
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected_policy
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 11230559.91159082
num_examples: 1530
download_size: 14404258
dataset_size: 11230559.91159082
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Lorotanida2/gui | ---
license: openrail
---
|
ai4bharat/IndicWikiBio-Translated | ---
dataset_info:
features:
- name: id
dtype: string
- name: infobox
dtype: string
- name: serialized_infobox
dtype: string
- name: summary
dtype: string
- name: itv2 hi infobox
dtype: string
- name: itv2 hi summary
dtype: string
splits:
- name: test
num_bytes: 7683659
num_examples: 1919
- name: validation
num_bytes: 7046869
num_examples: 1853
download_size: 5616013
dataset_size: 14730528
---
# Dataset Card for "indic-wikibio-hi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mnli_their_them | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 147353
num_examples: 590
- name: dev_mismatched
num_bytes: 210325
num_examples: 759
- name: test_matched
num_bytes: 134072
num_examples: 521
- name: test_mismatched
num_bytes: 201352
num_examples: 767
- name: train
num_bytes: 6014571
num_examples: 22928
download_size: 4074860
dataset_size: 6707673
---
# Dataset Card for "MULTI_VALUE_mnli_their_them"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Curvature/Test | ---
license: gpl
---
|
irds/lotte_lifestyle_dev | ---
pretty_name: '`lotte/lifestyle/dev`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `lotte/lifestyle/dev`
The `lotte/lifestyle/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/lotte#lotte/lifestyle/dev).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=268,893
This dataset is used by: [`lotte_lifestyle_dev_forum`](https://huggingface.co/datasets/irds/lotte_lifestyle_dev_forum), [`lotte_lifestyle_dev_search`](https://huggingface.co/datasets/irds/lotte_lifestyle_dev_search)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/lotte_lifestyle_dev', 'docs')
for record in docs:
    print(record)  # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Santhanam2021ColBERTv2,
title = "ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction",
author = "Keshav Santhanam and Omar Khattab and Jon Saad-Falcon and Christopher Potts and Matei Zaharia",
journal= "arXiv preprint arXiv:2112.01488",
year = "2021",
url = "https://arxiv.org/abs/2112.01488"
}
```
|
youyu0105/llm-MIDI | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 50994814
num_examples: 14606
download_size: 12039871
dataset_size: 50994814
---
# Dataset Card for "llm-MIDI"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Naveengo/codeparrot_10000_rows | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: repo_name
dtype: string
- name: path
dtype: string
- name: copies
dtype: string
- name: size
dtype: string
- name: content
dtype: string
- name: license
dtype: string
splits:
- name: train
num_bytes: 130556998.1704905
num_examples: 10000
- name: valid
num_bytes: 6658657.886815172
num_examples: 500
download_size: 52539728
dataset_size: 137215656.05730566
---
# Dataset Card for "codeparrot_10000_rows"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
grimulkan/physical-reasoning | ---
license: unknown
---
Q&A testing physical reasoning, in Alpaca format, generated by `gpt-4-1106-preview`. OpenAI terms apply.
Each answer was double-checked by `gpt-4-1106-preview`, and suspicious answers were removed, since even GPT-4 struggles with accuracy on this test. This does not guarantee that the remaining entries are correct, but the accuracy should be better than the unfiltered baseline.
The questions cover line-of-sight problems (who or what is visible from where in various situations), temperature-related questions, pressure-related questions, gravitational effects, and similar topics.
**Files:**
- `physical_reasoning.json` Double-checked physical reasoning questions based on the natural world (500 entries)
- `physical_reasoning_longer.json` Slightly longer Q&A (149 entries)
- `physical_reasoning_magic.json` Same, but assigns magical properties to the world and tests the resulting reasoning (e.g., imagine a mirror that only shows the reflection of what happened 10 seconds ago...) (250 entries)
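A minimal loading sketch for the files above; the Alpaca-style keys (`instruction`, `input`, `output`) are assumed from the stated format rather than confirmed by this card:
```python
import json

# Load the double-checked physical reasoning Q&A (500 entries).
with open("physical_reasoning.json", "r", encoding="utf-8") as f:
    records = json.load(f)

print(len(records))       # expected: 500
print(records[0].keys())  # assumed Alpaca keys: instruction, input, output
```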
|
open-llm-leaderboard/details_ZhangShenao__0.001_idpo_same_noreplacerej_declr_iter_2 | ---
pretty_name: Evaluation run of ZhangShenao/0.001_idpo_same_noreplacerej_declr_iter_2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ZhangShenao/0.001_idpo_same_noreplacerej_declr_iter_2](https://huggingface.co/ZhangShenao/0.001_idpo_same_noreplacerej_declr_iter_2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ZhangShenao__0.001_idpo_same_noreplacerej_declr_iter_2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-08T21:55:55.370751](https://huggingface.co/datasets/open-llm-leaderboard/details_ZhangShenao__0.001_idpo_same_noreplacerej_declr_iter_2/blob/main/results_2024-04-08T21-55-55.370751.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.600378533051076,\n\
\ \"acc_stderr\": 0.03312885011064985,\n \"acc_norm\": 0.6071013388677541,\n\
\ \"acc_norm_stderr\": 0.033832691009625576,\n \"mc1\": 0.38555691554467564,\n\
\ \"mc1_stderr\": 0.017038839010591673,\n \"mc2\": 0.5443868003824883,\n\
\ \"mc2_stderr\": 0.01579429140543887\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6040955631399317,\n \"acc_stderr\": 0.01429122839353659,\n\
\ \"acc_norm\": 0.628839590443686,\n \"acc_norm_stderr\": 0.014117971901142825\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6600278828918542,\n\
\ \"acc_stderr\": 0.00472731244889284,\n \"acc_norm\": 0.8521210914160526,\n\
\ \"acc_norm_stderr\": 0.003542544319405141\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099583,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099583\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396265,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396265\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.025542846817400492,\n \"\
acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.025542846817400492\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7258064516129032,\n\
\ \"acc_stderr\": 0.025378139970885203,\n \"acc_norm\": 0.7258064516129032,\n\
\ \"acc_norm_stderr\": 0.025378139970885203\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397457,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397457\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.025106820660539753,\n\
\ \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.025106820660539753\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.03191863374478466,\n \
\ \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.03191863374478466\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7871559633027523,\n \"acc_stderr\": 0.017549376389313694,\n \"\
acc_norm\": 0.7871559633027523,\n \"acc_norm_stderr\": 0.017549376389313694\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.01414397027665757,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.01414397027665757\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688214,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688214\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n\
\ \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n\
\ \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409825,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409825\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4217731421121252,\n\
\ \"acc_stderr\": 0.012612974369390975,\n \"acc_norm\": 0.4217731421121252,\n\
\ \"acc_norm_stderr\": 0.012612974369390975\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6258169934640523,\n \"acc_stderr\": 0.019576953122088833,\n \
\ \"acc_norm\": 0.6258169934640523,\n \"acc_norm_stderr\": 0.019576953122088833\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.046313813194254656,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.046313813194254656\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.031067211262872475,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.031067211262872475\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072766,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072766\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38555691554467564,\n\
\ \"mc1_stderr\": 0.017038839010591673,\n \"mc2\": 0.5443868003824883,\n\
\ \"mc2_stderr\": 0.01579429140543887\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773237\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2395754359363154,\n \
\ \"acc_stderr\": 0.01175686434407741\n }\n}\n```"
repo_url: https://huggingface.co/ZhangShenao/0.001_idpo_same_noreplacerej_declr_iter_2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|arc:challenge|25_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|gsm8k|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hellaswag|10_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T21-55-55.370751.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T21-55-55.370751.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- '**/details_harness|winogrande|5_2024-04-08T21-55-55.370751.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-08T21-55-55.370751.parquet'
- config_name: results
data_files:
- split: 2024_04_08T21_55_55.370751
path:
- results_2024-04-08T21-55-55.370751.parquet
- split: latest
path:
- results_2024-04-08T21-55-55.370751.parquet
---
# Dataset Card for Evaluation run of ZhangShenao/0.001_idpo_same_noreplacerej_declr_iter_2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ZhangShenao/0.001_idpo_same_noreplacerej_declr_iter_2](https://huggingface.co/ZhangShenao/0.001_idpo_same_noreplacerej_declr_iter_2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ZhangShenao__0.001_idpo_same_noreplacerej_declr_iter_2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-08T21:55:55.370751](https://huggingface.co/datasets/open-llm-leaderboard/details_ZhangShenao__0.001_idpo_same_noreplacerej_declr_iter_2/blob/main/results_2024-04-08T21-55-55.370751.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.600378533051076,
"acc_stderr": 0.03312885011064985,
"acc_norm": 0.6071013388677541,
"acc_norm_stderr": 0.033832691009625576,
"mc1": 0.38555691554467564,
"mc1_stderr": 0.017038839010591673,
"mc2": 0.5443868003824883,
"mc2_stderr": 0.01579429140543887
},
"harness|arc:challenge|25": {
"acc": 0.6040955631399317,
"acc_stderr": 0.01429122839353659,
"acc_norm": 0.628839590443686,
"acc_norm_stderr": 0.014117971901142825
},
"harness|hellaswag|10": {
"acc": 0.6600278828918542,
"acc_stderr": 0.00472731244889284,
"acc_norm": 0.8521210914160526,
"acc_norm_stderr": 0.003542544319405141
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099583,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099583
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462456,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462456
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396265,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396265
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.025542846817400492,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.025542846817400492
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7258064516129032,
"acc_stderr": 0.025378139970885203,
"acc_norm": 0.7258064516129032,
"acc_norm_stderr": 0.025378139970885203
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397457,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397457
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.025106820660539753,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.025106820660539753
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.03191863374478466,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.03191863374478466
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7871559633027523,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.7871559633027523,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.01414397027665757,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.01414397027665757
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688214,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688214
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33854748603351953,
"acc_stderr": 0.01582670009648135,
"acc_norm": 0.33854748603351953,
"acc_norm_stderr": 0.01582670009648135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409825,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4217731421121252,
"acc_stderr": 0.012612974369390975,
"acc_norm": 0.4217731421121252,
"acc_norm_stderr": 0.012612974369390975
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6258169934640523,
"acc_stderr": 0.019576953122088833,
"acc_norm": 0.6258169934640523,
"acc_norm_stderr": 0.019576953122088833
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.046313813194254656,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.046313813194254656
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.031067211262872475,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.031067211262872475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072766,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072766
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38555691554467564,
"mc1_stderr": 0.017038839010591673,
"mc2": 0.5443868003824883,
"mc2_stderr": 0.01579429140543887
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.011616198215773237
},
"harness|gsm8k|5": {
"acc": 0.2395754359363154,
"acc_stderr": 0.01175686434407741
}
}
```
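These aggregated numbers can also be loaded directly from the "results" configuration mentioned above; a minimal sketch, using the config and split names listed in this card's metadata:
```python
from datasets import load_dataset

# Aggregated metrics for the latest run, stored in the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_ZhangShenao__0.001_idpo_same_noreplacerej_declr_iter_2",
    "results",
    split="latest",
)
print(results[0])
```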
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
xzuyn/ethics-utilitarianism-dpo | ---
language:
- en
size_categories:
- 10K<n<100K
tags:
- human-feedback
- comparison
- rlhf
- dpo
- preference
- pairwise
---
[ETHICS Github](https://github.com/hendrycks/ethics)
The dataset has been modified to include a random starting message that asks for the preferred scenario. It also randomizes whether digits or words are used to number each scenario. I wanted to vary the quality of the instructions so that they resemble what a user might actually type, so the scenario prefixes can differ from one another; a rough sketch of how one example could be assembled is shown after the prefix counts below.
The order in which the scenarios are presented is randomized.
```
Selected Prefix: 'Choose the scenario which is better than the other.' Count: 1970
Selected Prefix: 'Which one is better?' Count: 1969
Selected Prefix: 'Please let me know what one of these you think is more ideal.' Count: 1966
Selected Prefix: 'Pick the more ideal situation.' Count: 1926
Selected Prefix: 'What scenario is better to you?' Count: 1901
Selected Prefix: 'What do you think is a better option?' Count: 2024
Selected Prefix: 'The following is two scenarios. Select which is better.' Count: 1982
Selected Scenario Prefix: 'scenario ' Count: 1744
Selected Scenario Prefix: 'Option ' Count: 1753
Selected Scenario Prefix: 'Choice ' Count: 1730
Selected Scenario Prefix: 'Situation ' Count: 1742
Selected Scenario Prefix: 'situation ' Count: 1705
Selected Scenario Prefix: 'choice ' Count: 1721
Selected Scenario Prefix: 'option ' Count: 1682
Selected Scenario Prefix: 'Scenario ' Count: 1661
Selected Scenario Prefix Number 1: '1: ' Count: 4586
Selected Scenario Prefix Number 1: 'One: ' Count: 4572
Selected Scenario Prefix Number 1: 'one: ' Count: 4580
Selected Scenario Prefix Number 2: '2: ' Count: 4502
Selected Scenario Prefix Number 2: 'two: ' Count: 4670
Selected Scenario Prefix Number 2: 'Two: ' Count: 4566
```
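A rough sketch of how one preference pair could be assembled from these pieces (the field names and prompt layout are assumptions for illustration, not the exact script used to build this dataset):
```python
import random

# Building blocks mirroring the prefix counts listed above
# (a hypothetical reconstruction, not the original generation script).
INSTRUCTION_PREFIXES = [
    "Choose the scenario which is better than the other.",
    "Which one is better?",
    "Please let me know what one of these you think is more ideal.",
    "Pick the more ideal situation.",
    "What scenario is better to you?",
    "What do you think is a better option?",
    "The following is two scenarios. Select which is better.",
]
SCENARIO_PREFIXES = ["Scenario ", "scenario ", "Option ", "option ",
                     "Choice ", "choice ", "Situation ", "situation "]
NUMBERS = [("1: ", "2: "), ("One: ", "Two: "), ("one: ", "two: ")]


def build_example(better: str, worse: str) -> dict:
    """Assemble one pairwise example with randomized prefixes and ordering."""
    instruction = random.choice(INSTRUCTION_PREFIXES)
    scenario_prefix = random.choice(SCENARIO_PREFIXES)
    first_num, second_num = random.choice(NUMBERS)
    pair = [("chosen", better), ("rejected", worse)]
    random.shuffle(pair)  # presentation order is randomized
    prompt = (
        f"{instruction}\n"
        f"{scenario_prefix}{first_num}{pair[0][1]}\n"
        f"{scenario_prefix}{second_num}{pair[1][1]}"
    )
    # Record which slot holds the preferred (utilitarian-better) scenario.
    chosen_slot = first_num if pair[0][0] == "chosen" else second_num
    return {"prompt": prompt,
            "chosen_slot": f"{scenario_prefix}{chosen_slot}".rstrip(": ")}


print(build_example("I donated my bonus to charity.",
                    "I kept the wallet that my neighbor lost."))
```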
# Paper: [Aligning AI With Shared Human Values](https://arxiv.org/pdf/2008.02275)
```
@article{hendrycks2021ethics,
title={Aligning AI With Shared Human Values},
author={Dan Hendrycks and Collin Burns and Steven Basart and Andrew Critch and Jerry Li and Dawn Song and Jacob Steinhardt},
journal={Proceedings of the International Conference on Learning Representations (ICLR)},
year={2021}
}
``` |
Jonnyck/myself | ---
license: other
---
|
distilled-from-one-sec-cv12/chunk_41 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1280731656
num_examples: 249558
download_size: 1302386764
dataset_size: 1280731656
---
# Dataset Card for "chunk_41"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eming/test | ---
license: mit
dataset_info:
features:
- name: a
dtype: int64
splits:
- name: train
num_bytes: 48.0
num_examples: 6
- name: validation
num_bytes: 16.0
num_examples: 2
- name: test
num_bytes: 16.0
num_examples: 2
download_size: 2507
dataset_size: 80.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
BangumiBase/shokeishoujonovirginroad | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Shokei Shoujo No Virgin Road
This is the image base of the bangumi Shokei Shoujo no Virgin Road. We detected 18 characters and 1105 images in total. The full dataset is available [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may still contain noise.** If you intend to train models on this dataset, we recommend preprocessing the downloaded files to eliminate potential noisy samples (approximately 1% probability).
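For example, a single character pack can be fetched and unpacked with `huggingface_hub` before doing that cleaning (a minimal sketch; the file names follow the download links in the preview table below):
```python
import zipfile
from huggingface_hub import hf_hub_download

# Fetch the pack for character #7; "all.zip" holds the full dataset.
pack = hf_hub_download(
    repo_id="BangumiBase/shokeishoujonovirginroad",
    filename="7/dataset.zip",
    repo_type="dataset",
)

# Unpack locally, then filter out the roughly 1% of noisy samples
# with your own preprocessing before training.
with zipfile.ZipFile(pack) as zf:
    zf.extractall("character_7")
```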
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 7 | [Download](0/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 1 | 19 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 24 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 11 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 30 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 10 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 37 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 228 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 30 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 44 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 76 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 34 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 25 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 49 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 266 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 79 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 6 | [Download](16/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| noise | 130 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
freshpearYoon/vr_train_free_48 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 7020373217
num_examples: 10000
download_size: 1127712923
dataset_size: 7020373217
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_wnli_remove_det_indefinite | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 4339
num_examples: 21
- name: test
num_bytes: 17537
num_examples: 64
- name: train
num_bytes: 31175
num_examples: 157
download_size: 26849
dataset_size: 53051
---
# Dataset Card for "MULTI_VALUE_wnli_remove_det_indefinite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_sst2_come_future | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 3777
num_examples: 23
- name: test
num_bytes: 7250
num_examples: 48
- name: train
num_bytes: 100650
num_examples: 737
download_size: 51376
dataset_size: 111677
---
# Dataset Card for "MULTI_VALUE_sst2_come_future"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dongyoung4091/hh-rlhf_with_features_flan_t5_large | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: helpfulness_chosen
dtype: int64
- name: helpfulness_rejected
dtype: int64
- name: specificity_chosen
dtype: int64
- name: specificity_rejected
dtype: int64
- name: intent_chosen
dtype: int64
- name: intent_rejected
dtype: int64
- name: factuality_chosen
dtype: int64
- name: factuality_rejected
dtype: int64
- name: easy-to-understand_chosen
dtype: int64
- name: easy-to-understand_rejected
dtype: int64
- name: relevance_chosen
dtype: int64
- name: relevance_rejected
dtype: int64
- name: readability_chosen
dtype: int64
- name: readability_rejected
dtype: int64
- name: enough-detail_chosen
dtype: int64
- name: enough-detail_rejected
dtype: int64
- name: biased:_chosen
dtype: int64
- name: biased:_rejected
dtype: int64
- name: fail-to-consider-individual-preferences_chosen
dtype: int64
- name: fail-to-consider-individual-preferences_rejected
dtype: int64
- name: repetetive_chosen
dtype: int64
- name: repetetive_rejected
dtype: int64
- name: fail-to-consider-context_chosen
dtype: int64
- name: fail-to-consider-context_rejected
dtype: int64
- name: too-long_chosen
dtype: int64
- name: too-long_rejected
dtype: int64
- name: human
dtype: string
- name: assistant_chosen
dtype: string
- name: assistant_rejected
dtype: string
- name: log_score_chosen
dtype: float64
- name: log_score_rejected
dtype: float64
- name: labels
dtype: string
splits:
- name: train
num_bytes: 14434424
num_examples: 9574
- name: test
num_bytes: 14378349
num_examples: 9574
download_size: 15748504
dataset_size: 28812773
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Back-up/ds2k | ---
dataset_info:
features:
- name: url
dtype: string
- name: text
dtype: string
- name: perplexity
dtype: float64
- name: num_char
dtype: string
- name: num_word
dtype: string
splits:
- name: train
num_bytes: 276071533.4166619
num_examples: 11544
download_size: 80838433
dataset_size: 276071533.4166619
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
language-and-voice-lab/althingi_asr | ---
annotations_creators:
- machine-generated
language:
- is
language_creators:
- found
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Althingi Parliamentary Speech
size_categories:
- 100K<n<1M
source_datasets:
- original
tags:
- icelandic
- parliamentary speech
- parlament
- althingi
task_categories:
- automatic-speech-recognition
task_ids: []
---
# Dataset Card for althingi_asr
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Data](#data)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Other Known Limitations](#other-known-limitations)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** Althingi Parliamentary Speech
- **Repository:** [LDC](https://catalog.ldc.upenn.edu/LDC2021S01)
- **Paper:** [Building an ASR corpus using Althingi’s Parliamentary Speeches](https://www.researchgate.net/profile/Jon-Gudnason/publication/319185185_Building_an_ASR_Corpus_Using_Althingi's_Parliamentary_Speeches/links/5d1dbdd3a6fdcc2462bdda0f/Building-an-ASR-Corpus-Using-Althingis-Parliamentary-Speeches.pdf)
- **Point of Contact:** [Jón Guðnason](mailto:jg@ru.is)
### Dataset Summary
Althingi Parliamentary Speech consists of approximately 542 hours of recorded speech from Althingi, the Icelandic Parliament, along with corresponding transcripts, a pronunciation dictionary and two language models. Speeches date from 2005-2016.
This dataset was collected in 2016 by the ASR for Althingi project at [Reykjavik University](https://en.ru.is/) in collaboration with the Althingi speech department. The purpose of that project was to develop an ASR (automatic speech recognition) system for parliamentary speech to replace the procedure of manually transcribing performed speeches.
### Data
The mean speech length is six minutes, with speeches ranging from under one minute to around thirty minutes. The corpus features 197 speakers (105 male, 92 female) and is split into training, development and evaluation sets. The language models are of two types: a pruned trigram model, used in decoding, and an unpruned constant ARPA 5-gram model, used for re-scoring decoding results.
Audio data is presented as single channel 16-bit mp3 files; the majority of these files have a sample rate of 44.1 kHz. Transcripts and other text data are plain text encoded in UTF-8.
### Example Usage
The Althingi Corpus is divided into three splits: train, validation, and test. To load the full dataset:
```python
from datasets import load_dataset
althingi_asr = load_dataset("language-and-voice-lab/althingi_asr")
```
To load a specific split (for example, the validation split), pass its name to the `split` argument:
```python
from datasets import load_dataset
althingi_asr = load_dataset("language-and-voice-lab/althingi_asr",split="validation")
```
### Supported Tasks
automatic-speech-recognition: The dataset can be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe the audio file to written text. The most common evaluation metric is the word error rate (WER).
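For instance, the word error rate between a reference transcript and a model hypothesis can be computed as follows (a minimal sketch assuming the `jiwer` package is installed; the strings are illustrative placeholders):
```python
from jiwer import wer

# Reference transcript vs. an (imperfect) ASR hypothesis.
reference = "og má svo sannarlega segja að landslagið sé nokkuð breytt"
hypothesis = "og má svo sannarlega segja að landslag sé nokkuð breytt"

print(f"WER: {wer(reference, hypothesis):.3f}")
```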
### Languages
The audio is in Icelandic.
## Dataset Structure
### Data Instances
```python
{
'audio_id': 'rad20160602T000219_00083',
'audio': {
'path': '/home/inga/.cache/HuggingFace/datasets/downloads/extracted/52607f9db9e3394263070575d29323213b99a06a996c43d4fe75bca115827d12/dev/EyH/rad20160602T000219/rad20160602T000219_00083.flac',
'array': array([-0.01098633, -0.01489258, -0.01040649, ..., 0.00314331,
0.00186157, 0.00527954], dtype=float32),
'sampling_rate': 16000
},
'speaker_id': 'rad20160602T000219',
'duration': 12.67199993133545,
'normalized_text': 'og má svo sannarlega segja að landslagið sé nokkuð breytt frá því þrjú komma tvö prósent þjóðarinnar töldust vera innflytjendur árið tvö þúsund en nú teljast tíu prósent þjóðarinnar vera fyrsta og önnur kynslóð innflytjenda'
}
```
### Data Fields
* `audio_id` (string) - id of audio segment
* `audio` (datasets.Audio) - a dictionary containing the path to the audio, the decoded audio array, and the sampling rate. In non-streaming mode (default), the path points to the locally extracted audio. In streaming mode, the path is the relative path of an audio inside its archive (as files are not downloaded and extracted locally).
* `speaker_id` (string) - id of speaker
* `duration` (float32) - duration of the audio file in seconds.
* `normalized_text` (string) - normalized audio segment transcription.
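Building on the loading example above, a short sketch of how these fields can be accessed:
```python
from datasets import load_dataset

althingi_asr = load_dataset("language-and-voice-lab/althingi_asr", split="validation")

sample = althingi_asr[0]
print(sample["audio_id"], sample["speaker_id"], sample["duration"])
print(sample["normalized_text"])

# The audio field decodes to a dict with the waveform array and sampling rate.
waveform = sample["audio"]["array"]
sampling_rate = sample["audio"]["sampling_rate"]
print(len(waveform) / sampling_rate, "seconds of audio")
```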
### Data Splits
The corpus is split into train, evaluation, and test portions. The lengths of the portions are: train = 514h29m, test = 13h52m, evaluation = 14h02m.
To load a specific portion, please see the "Example Usage" section above.
## Additional Information
### Other Known Limitations
"Althingi Parliamentary Speech" by the Language and Voice Laboratory (LVL) at the Reykjavik University is licensed under a Creative Commons Attribution 4.0 International (CC BY 4.0) License with the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
### Licensing Information
[CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/)
### Citation Information
```
@misc{helgadottiralthingi2021,
title={Althingi Parliamentary Speech},
ldc_catalog_no={LDC2021S01},
DOI={https://doi.org/10.35111/695b-6697},
author={Helgadóttir, Inga Rún and Kjaran, Róbert and Nikulásdóttir, Anna Björk and Guðnason, Jón},
  publisher={Reykjavík University},
journal={Linguistic Data Consortium, Philadelphia},
year={2021},
url={https://catalog.ldc.upenn.edu/LDC2021S01},
}
```
### Contributions
This project was made possible through the support of Althingi’s information and publications departments. The authors would like to thank Solveig K. Jónsdóttir, Þorbjörg Árnadóttir and Ingvi Stígsson for their valuable help.
|
doceoSoftware/docvqa_clicars_permiscirculacio_Mireia_180_2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: query
sequence: string
- name: answers
sequence: string
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 9290180.0
num_examples: 180
- name: test
num_bytes: 41232.0
num_examples: 1
download_size: 9030806
dataset_size: 9331412.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
jlbaker361/flickr_humans_dim_128_50k | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: src
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 1466320030.0
num_examples: 50000
download_size: 1464372542
dataset_size: 1466320030.0
---
# Dataset Card for "flickr_humans_dim_128_50k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_DreadPoor__iWillChangeTheNameLater | ---
pretty_name: Evaluation run of DreadPoor/iWillChangeTheNameLater
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DreadPoor/iWillChangeTheNameLater](https://huggingface.co/DreadPoor/iWillChangeTheNameLater)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__iWillChangeTheNameLater\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-21T05:02:41.760263](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__iWillChangeTheNameLater/blob/main/results_2024-02-21T05-02-41.760263.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6551490059765831,\n\
\ \"acc_stderr\": 0.031986433271542374,\n \"acc_norm\": 0.6547881343259481,\n\
\ \"acc_norm_stderr\": 0.032652484132244745,\n \"mc1\": 0.543451652386781,\n\
\ \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6940561007329668,\n\
\ \"mc2_stderr\": 0.014993134152307568\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6953924914675768,\n \"acc_stderr\": 0.013449522109932485,\n\
\ \"acc_norm\": 0.7201365187713311,\n \"acc_norm_stderr\": 0.013119040897725922\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.711113324039036,\n\
\ \"acc_stderr\": 0.004523188431142894,\n \"acc_norm\": 0.8822943636725752,\n\
\ \"acc_norm_stderr\": 0.0032160063577603747\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569526,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569526\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\"\
: 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829194,\n \
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829194\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092448,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092448\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250454,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250454\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503224,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503224\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.013306478243066298,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.013306478243066298\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n\
\ \"acc_stderr\": 0.016513676031179588,\n \"acc_norm\": 0.4212290502793296,\n\
\ \"acc_norm_stderr\": 0.016513676031179588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658533,\n\
\ \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658533\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n\
\ \"acc_stderr\": 0.012754553719781752,\n \"acc_norm\": 0.47522816166883963,\n\
\ \"acc_norm_stderr\": 0.012754553719781752\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.543451652386781,\n\
\ \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6940561007329668,\n\
\ \"mc2_stderr\": 0.014993134152307568\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8421468034727704,\n \"acc_stderr\": 0.010247165248719763\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6853677028051555,\n \
\ \"acc_stderr\": 0.012791037227336034\n }\n}\n```"
repo_url: https://huggingface.co/DreadPoor/iWillChangeTheNameLater
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|arc:challenge|25_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|gsm8k|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hellaswag|10_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T05-02-41.760263.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T05-02-41.760263.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- '**/details_harness|winogrande|5_2024-02-21T05-02-41.760263.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-21T05-02-41.760263.parquet'
- config_name: results
data_files:
- split: 2024_02_21T05_02_41.760263
path:
- results_2024-02-21T05-02-41.760263.parquet
- split: latest
path:
- results_2024-02-21T05-02-41.760263.parquet
---
# Dataset Card for Evaluation run of DreadPoor/iWillChangeTheNameLater
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/iWillChangeTheNameLater](https://huggingface.co/DreadPoor/iWillChangeTheNameLater) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__iWillChangeTheNameLater",
"harness_winogrande_5",
split="train")
```
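The aggregated metrics can be pulled in the same way by pointing at the "results" configuration; a minimal sketch, using the split names declared in the YAML header above (the timestamped split and `latest` both resolve to the parquet files listed there):
```python
from datasets import load_dataset

# Load the aggregated results; "latest" always points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_DreadPoor__iWillChangeTheNameLater",
    "results",
    split="latest",
)
print(results[0])
```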
## Latest results
These are the [latest results from run 2024-02-21T05:02:41.760263](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__iWillChangeTheNameLater/blob/main/results_2024-02-21T05-02-41.760263.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6551490059765831,
"acc_stderr": 0.031986433271542374,
"acc_norm": 0.6547881343259481,
"acc_norm_stderr": 0.032652484132244745,
"mc1": 0.543451652386781,
"mc1_stderr": 0.017437280953183688,
"mc2": 0.6940561007329668,
"mc2_stderr": 0.014993134152307568
},
"harness|arc:challenge|25": {
"acc": 0.6953924914675768,
"acc_stderr": 0.013449522109932485,
"acc_norm": 0.7201365187713311,
"acc_norm_stderr": 0.013119040897725922
},
"harness|hellaswag|10": {
"acc": 0.711113324039036,
"acc_stderr": 0.004523188431142894,
"acc_norm": 0.8822943636725752,
"acc_norm_stderr": 0.0032160063577603747
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569526,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569526
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298902,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298902
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.02983796238829194,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.02983796238829194
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092448,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092448
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250454,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250454
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503224,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066298,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066298
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.016513676031179588,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.016513676031179588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47522816166883963,
"acc_stderr": 0.012754553719781752,
"acc_norm": 0.47522816166883963,
"acc_norm_stderr": 0.012754553719781752
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.543451652386781,
"mc1_stderr": 0.017437280953183688,
"mc2": 0.6940561007329668,
"mc2_stderr": 0.014993134152307568
},
"harness|winogrande|5": {
"acc": 0.8421468034727704,
"acc_stderr": 0.010247165248719763
},
"harness|gsm8k|5": {
"acc": 0.6853677028051555,
"acc_stderr": 0.012791037227336034
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
beyond/rlhf-reward-single-round-trans_chinese | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 12139022
num_examples: 19862
- name: test
num_bytes: 3117841
num_examples: 4996
download_size: 10699367
dataset_size: 15256863
---
# Dataset Card for "rlhf-reward-single-round-trans_chinese"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pfh1976/missionGenPFH-dataset | ---
dataset_info:
features:
- name: name
dtype: string
- name: description
dtype: string
- name: mission
dtype: string
splits:
- name: train
num_bytes: 4729
num_examples: 30
download_size: 5650
dataset_size: 4729
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "missionGenPFH-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Llamas-competition/public_test_data | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: int64
splits:
- name: train
num_bytes: 5572494.357142856
num_examples: 177
download_size: 5341728
dataset_size: 5572494.357142856
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
chathuranga-jayanath/context-5-rhino-finmath-times4j-html-mavendoxia-wro4j-guava-supercsv-len-20000-prompt-1 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: filepath
dtype: string
- name: start_bug_line
dtype: int64
- name: end_bug_line
dtype: int64
- name: bug
dtype: string
- name: fix
dtype: string
- name: ctx
dtype: string
splits:
- name: train
num_bytes: 49042198
num_examples: 77473
- name: validation
num_bytes: 6148194
num_examples: 9684
- name: test
num_bytes: 6125074
num_examples: 9684
download_size: 25481980
dataset_size: 61315466
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
DBQ/Mr.Porter.Product.prices.Czech.Republic | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Czech Republic - Mr Porter - Product-level price list
tags:
- webscraping
- ecommerce
- Mr Porter
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: int64
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 9135031
num_examples: 27800
download_size: 2087225
dataset_size: 9135031
---
# Mr Porter web scraped data
## About the website
The **EMEA industry**, specifically in the **Czech Republic**, has seen significant growth in **ecommerce**. An increasing number of consumers are choosing to shop online due to its accessibility and convenience. A key player in this market is **Mr Porter**, an online retail platform specializing in men's luxury fashion. The collected dataset provides deep insight into the **product-list page (PLP) data** of Mr Porter, offering valuable information on customer preferences, market trends, and product popularity. Through this data, it is evident that the ecommerce sector in the Czech Republic is thriving with ample growth opportunities.
## Link to **dataset**
[Czech Republic - Mr Porter - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Mr%20Porter%20Product-prices%20Czech%20Republic/r/recBcEw2pvdbDiMDl)
|
tyzhu/squad_qa_no_id_v5_full_recite_full_passage_first_permute_rerun | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 8870980.899370227
num_examples: 4778
- name: validation
num_bytes: 580390
num_examples: 300
download_size: 1748178
dataset_size: 9451370.899370227
---
# Dataset Card for "squad_qa_no_id_v5_full_recite_full_passage_first_permute_rerun"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rizquuula/IndonesianFactHoaxPoliticalNews | ---
license: apache-2.0
---
|
cmalaviya/quest | ---
configs:
- config_name: main
data_files:
- split: train
path: train.jsonl
- split: test
path: test.jsonl
- split: validation
path: val.jsonl
license: apache-2.0
task_categories:
- text-retrieval
language:
- en
source_datasets:
- original
pretty_name: QUEST
annotations_creators:
- wikipedia-sourced
size_categories:
- 1K<n<10K
---
# Dataset Card for QUEST
## Dataset Description
- **Repository: https://github.com/google-research/language/tree/master/language/quest**
- **Paper: https://arxiv.org/abs/2305.11694**
- **Point of Contact: chaitanyamalaviya@gmail.com**
### Dataset Summary
We provide here the data accompanying the paper: [QUEST: A Retrieval Dataset of Entity-Seeking Queries with Implicit Set Operations](https://arxiv.org/abs/2305.11694).
## Dataset Structure
### Data Instances
QUEST contains 6307 training queries, 323 examples for development, and 1727 examples for testing.
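A minimal sketch for loading the three splits with the `datasets` library, assuming the `main` configuration declared in the YAML header above (the fields referenced here are described in the next subsection):
```python
from datasets import load_dataset

# Load the "main" configuration, which maps train/validation/test to the jsonl files.
quest = load_dataset("cmalaviya/quest", "main")

print(quest)                 # DatasetDict with train, validation, and test splits
example = quest["train"][0]
print(example["query"])      # paraphrased query written by annotators
print(len(example["docs"]))  # number of relevant document titles
```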
### Data Fields
Each examples file contains newline-separated JSON dictionaries with the following fields:
* `query` - Paraphrased query written by annotators.
* `docs` - List of relevant document titles.
* `original_query` - The original query which was paraphrased. Atomic queries are
enclosed by `<mark></mark>`. Augmented queries do not have this field populated.
* `scores` - This field is not populated and only used when producing predictions to enable sharing the same data structure.
* `metadata` - A dictionary with the following fields:
* `template` - The template used to create the query.
* `domain` - The domain to which the query belongs.
* `fluency` - List of fluency ratings for the query.
* `meaning` - List of ratings for whether the paraphrased query meaning is the
same as the original query.
* `naturalness` - List of naturalness ratings for the query.
* `relevance_ratings` - Dictionary mapping document titles to relevance ratings
for the document.
* `evidence_ratings` - Dictionary mapping document titles to evidence ratings
for the document.
  * `attributions` - Dictionary mapping a document title to its attributions;
  attributions are a list of dictionaries mapping a query substring to a
  document substring.
The document corpus is at https://storage.googleapis.com/gresearch/quest/documents.jsonl. Note that this file is quite large
(899MB). The format is newline-separated JSON dicts containing `title` and `text`.
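Because the corpus file is large, a streaming, line-by-line read keeps memory use bounded; a minimal sketch, assuming the file has already been downloaded locally as `documents.jsonl`:
```python
import json

# Iterate over the newline-delimited corpus without loading all 899MB at once.
def iter_documents(path="documents.jsonl"):
    with open(path, "r", encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                doc = json.loads(line)
                yield doc["title"], doc["text"]

# Example: peek at the first few documents.
for i, (title, text) in enumerate(iter_documents()):
    print(title, len(text))
    if i >= 4:
        break
```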
### Citation Information
```
@inproceedings{malaviya23expertqa,
title = {QUEST: A Retrieval Dataset of Entity-Seeking Queries with Implicit Set Operations},
author = {Chaitanya Malaviya and Peter Shaw and Ming-Wei Chang and Kenton Lee and Kristina Toutanova},
booktitle = {ACL},
year = {2023},
url = "https://arxiv.org/abs/2305.11694"
}
```
|
CyberHarem/mina_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mina/近衛ミナ/南 (Blue Archive)
This is the dataset of mina/近衛ミナ/南 (Blue Archive), containing 135 images and their tags.
The core tags of this character are `long_hair, breasts, halo, sunglasses, green_hair, hair_over_one_eye, eyewear_on_head, red_eyes, hair_ornament, large_breasts, hair_rings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 135 | 250.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mina_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 135 | 207.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mina_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 366 | 415.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mina_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mina_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 26 |  |  |  |  |  | 1girl, black_necktie, high-waist_pants, looking_at_viewer, white_shirt, black_gloves, black_pants, solo, jacket, long_sleeves, coat_on_shoulders, collared_shirt, simple_background, white_background, closed_mouth, hair_stick, striped_coat, holding |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_necktie | high-waist_pants | looking_at_viewer | white_shirt | black_gloves | black_pants | solo | jacket | long_sleeves | coat_on_shoulders | collared_shirt | simple_background | white_background | closed_mouth | hair_stick | striped_coat | holding |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:-------------------|:--------------------|:--------------|:---------------|:--------------|:-------|:---------|:---------------|:--------------------|:-----------------|:--------------------|:-------------------|:---------------|:-------------|:---------------|:----------|
| 0 | 26 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_ZhangShenao__0.001_idpo_declr_4iters_iter_4 | ---
pretty_name: Evaluation run of ZhangShenao/0.001_idpo_declr_4iters_iter_4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ZhangShenao/0.001_idpo_declr_4iters_iter_4](https://huggingface.co/ZhangShenao/0.001_idpo_declr_4iters_iter_4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ZhangShenao__0.001_idpo_declr_4iters_iter_4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-08T08:47:41.359633](https://huggingface.co/datasets/open-llm-leaderboard/details_ZhangShenao__0.001_idpo_declr_4iters_iter_4/blob/main/results_2024-04-08T08-47-41.359633.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6028509234883918,\n\
\ \"acc_stderr\": 0.033138602505692234,\n \"acc_norm\": 0.6092684933664486,\n\
\ \"acc_norm_stderr\": 0.0338311840506749,\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5100933383252404,\n\
\ \"mc2_stderr\": 0.015980811380994712\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5981228668941979,\n \"acc_stderr\": 0.014327268614578274,\n\
\ \"acc_norm\": 0.6271331058020477,\n \"acc_norm_stderr\": 0.014131176760131174\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6598287193786099,\n\
\ \"acc_stderr\": 0.0047279834341954945,\n \"acc_norm\": 0.8509261103365864,\n\
\ \"acc_norm_stderr\": 0.003554333976897245\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.04971358884367405,\n\
\ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.04971358884367405\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469543,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469543\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5666666666666667,\n \"acc_stderr\": 0.025124653525885113,\n\
\ \"acc_norm\": 0.5666666666666667,\n \"acc_norm_stderr\": 0.025124653525885113\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587193,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552378,\n \
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552378\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443128,\n \"\
acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443128\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.033622774366080445,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.033622774366080445\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n\
\ \"acc_stderr\": 0.014214138556913912,\n \"acc_norm\": 0.8033205619412516,\n\
\ \"acc_norm_stderr\": 0.014214138556913912\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32849162011173183,\n\
\ \"acc_stderr\": 0.015707935398496454,\n \"acc_norm\": 0.32849162011173183,\n\
\ \"acc_norm_stderr\": 0.015707935398496454\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.02736359328468497,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.02736359328468497\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882117,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882117\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43089960886571055,\n\
\ \"acc_stderr\": 0.012647695889547231,\n \"acc_norm\": 0.43089960886571055,\n\
\ \"acc_norm_stderr\": 0.012647695889547231\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6241830065359477,\n \"acc_stderr\": 0.019594021136577443,\n \
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.019594021136577443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242304,\n\
\ \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242304\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.02768691358801301,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.02768691358801301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5100933383252404,\n\
\ \"mc2_stderr\": 0.015980811380994712\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.01176414905469833\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.27369219105382864,\n \
\ \"acc_stderr\": 0.012281003490963458\n }\n}\n```"
repo_url: https://huggingface.co/ZhangShenao/0.001_idpo_declr_4iters_iter_4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|arc:challenge|25_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|gsm8k|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hellaswag|10_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T08-47-41.359633.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T08-47-41.359633.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- '**/details_harness|winogrande|5_2024-04-08T08-47-41.359633.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-08T08-47-41.359633.parquet'
- config_name: results
data_files:
- split: 2024_04_08T08_47_41.359633
path:
- results_2024-04-08T08-47-41.359633.parquet
- split: latest
path:
- results_2024-04-08T08-47-41.359633.parquet
---
# Dataset Card for Evaluation run of ZhangShenao/0.001_idpo_declr_4iters_iter_4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ZhangShenao/0.001_idpo_declr_4iters_iter_4](https://huggingface.co/ZhangShenao/0.001_idpo_declr_4iters_iter_4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Splits for each configuration are the run timestamp and "latest" (see the configs above).
data = load_dataset("open-llm-leaderboard/details_ZhangShenao__0.001_idpo_declr_4iters_iter_4",
	"harness_winogrande_5",
	split="latest")
```
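Similarly, the aggregated metrics can be loaded from the "results" configuration (a minimal sketch; the config and split names follow the YAML header above):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_ZhangShenao__0.001_idpo_declr_4iters_iter_4",
	"results",
	split="latest")
```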
## Latest results
These are the [latest results from run 2024-04-08T08:47:41.359633](https://huggingface.co/datasets/open-llm-leaderboard/details_ZhangShenao__0.001_idpo_declr_4iters_iter_4/blob/main/results_2024-04-08T08-47-41.359633.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6028509234883918,
"acc_stderr": 0.033138602505692234,
"acc_norm": 0.6092684933664486,
"acc_norm_stderr": 0.0338311840506749,
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.5100933383252404,
"mc2_stderr": 0.015980811380994712
},
"harness|arc:challenge|25": {
"acc": 0.5981228668941979,
"acc_stderr": 0.014327268614578274,
"acc_norm": 0.6271331058020477,
"acc_norm_stderr": 0.014131176760131174
},
"harness|hellaswag|10": {
"acc": 0.6598287193786099,
"acc_stderr": 0.0047279834341954945,
"acc_norm": 0.8509261103365864,
"acc_norm_stderr": 0.003554333976897245
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462456,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462456
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.04971358884367405,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.04971358884367405
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469543,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469543
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5666666666666667,
"acc_stderr": 0.025124653525885113,
"acc_norm": 0.5666666666666667,
"acc_norm_stderr": 0.025124653525885113
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587193,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.03163145807552378,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.03163145807552378
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7834862385321101,
"acc_stderr": 0.017658710594443128,
"acc_norm": 0.7834862385321101,
"acc_norm_stderr": 0.017658710594443128
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.033622774366080445,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.033622774366080445
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.03181149747055359,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.03181149747055359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8033205619412516,
"acc_stderr": 0.014214138556913912,
"acc_norm": 0.8033205619412516,
"acc_norm_stderr": 0.014214138556913912
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32849162011173183,
"acc_stderr": 0.015707935398496454,
"acc_norm": 0.32849162011173183,
"acc_norm_stderr": 0.015707935398496454
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.02736359328468497,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.02736359328468497
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02622964917882117,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02622964917882117
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43089960886571055,
"acc_stderr": 0.012647695889547231,
"acc_norm": 0.43089960886571055,
"acc_norm_stderr": 0.012647695889547231
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.02916312857067073,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.02916312857067073
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.019594021136577443,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.019594021136577443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242304,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242304
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801301,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.5100933383252404,
"mc2_stderr": 0.015980811380994712
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.01176414905469833
},
"harness|gsm8k|5": {
"acc": 0.27369219105382864,
"acc_stderr": 0.012281003490963458
}
}
```
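As a rough illustration (not the leaderboard's exact aggregation code), the MMLU-style average can be recomputed from the per-task entries above, assuming the dictionary shown has been loaded into a variable named `results`:
```python
# `results` is assumed to hold the dictionary printed above.
mmlu = {name: scores for name, scores in results.items()
        if name.startswith("harness|hendrycksTest")}
mmlu_acc_norm = sum(scores["acc_norm"] for scores in mmlu.values()) / len(mmlu)
print(f"Average acc_norm over {len(mmlu)} MMLU subjects: {mmlu_acc_norm:.4f}")
```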
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jstoone/donate_a_cry | ---
license: mit
task_categories:
- audio-classification
pretty_name: Donate A Cry Corpus
--- |
Bastao/VeraCruz_PT-BR | ---
configs:
- config_name: Portugal (PT)
data_files: pt/*.parquet
- config_name: Brazil (BR)
data_files: br/*.parquet
- config_name: Other
data_files: other/*.parquet
task_categories:
- text-generation
- text-classification
language:
- pt
tags:
- pt
- br
- portuguese
- brazilian
- portugal
- brazil
size_categories:
- 100M<n<1B
---
# Dataset Summary
The VeraCruz Dataset is a comprehensive collection of Portuguese language content, showcasing the linguistic and cultural diversity of Portuguese-speaking regions. It includes around 190 million samples, organized into primary categories by regional origin, as indicated by URL metadata. The primary categories are:
- **Portugal (PT)**: Samples with content URLs indicating a clear Portuguese origin.
- **Brazil (BR)**: Samples with content URLs indicating a clear Brazilian origin.
- **Other**: Samples where the URL metadata does not clearly indicate a Portuguese or Brazilian origin. These samples were further classified into "PT" or "BR" categories using the [PeroVaz_PT-PTBR_Classifier](https://huggingface.co/Bastao/PeroVaz_PT-PTBR_Classifier), which is trained specifically to distinguish between the European and Brazilian variations of Portuguese.
Each entry in this category is supplemented with two extra columns: 'label' and 'score'.
The 'label' column indicates the predicted category (PT or BR), and the 'score' column represents the probability of the predicted label.
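Below is a minimal loading sketch using the 🤗 `datasets` library; it assumes the config names listed in the header above, that each config exposes a single `train` split, and uses an arbitrary 0.9 score threshold purely as an illustration.
```python
from datasets import load_dataset

# Load the "Other" configuration (samples whose URL did not clearly indicate PT or BR).
other = load_dataset("Bastao/VeraCruz_PT-BR", "Other", split="train")

# Keep only samples confidently classified as European Portuguese by the
# PeroVaz_PT-PTBR_Classifier, using the 'label' and 'score' columns described above.
confident_pt = other.filter(lambda row: row["label"] == "PT" and row["score"] >= 0.9)
print(len(confident_pt))
```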
# Source Data
The VeraCruz Dataset is derived from the [CulturaX](https://huggingface.co/datasets/uonlp/CulturaX) dataset's Portuguese language segment, a comprehensive collection known for its broad linguistic coverage across multiple languages.
However, the original [CulturaX](https://huggingface.co/datasets/uonlp/CulturaX) dataset does not differentiate between the two variants of Portuguese.
# Personal and Sensitive Information
Given the dataset's extensive nature, it may contain personal and sensitive information. Users are advised to handle the data responsibly, employing ethical practices and privacy-compliant measures such as data anonymization where necessary. It is crucial to respect individual privacy and adhere to legal standards when utilizing this dataset.
# Licensing Information
The license terms for the VeraCruz Dataset strictly follow those of mC4 and OSCAR. Please refer to the licenses of both datasets when using VeraCruz:
- [mC4 License Details](https://huggingface.co/datasets/allenai/c4#license)
- [OSCAR License Details](https://huggingface.co/datasets/oscar-corpus/OSCAR-2301#licensing-information) |
Francesco-A/github-issues_huggingface-datasets | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: comments
dtype: int64
- name: created_at
dtype: string
- name: updated_at
dtype: string
- name: closed_at
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: is_pull_request
dtype: bool
- name: comments_text
sequence: string
splits:
- name: train
num_bytes: 19873685.857377857
num_examples: 4863
- name: test
num_bytes: 4969443.142622142
num_examples: 1216
download_size: 8711986
dataset_size: 24843129.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "github-issues_huggingface-datasets"
**Dataset Name**: *GitHub Issues from Hugging Face Datasets*
**Description**:
The "github-issues_huggingface-datasets" dataset is a corpus of GitHub issues extracted from the Hugging Face Datasets repository. It includes valuable information and metadata related to the issues, such as titles, descriptions, labels, states, comments, and whether they are pull requests. The dataset was compiled using the GitHub REST API, which enabled the retrieval of issues and their corresponding comments. Additionally, a new column was added to indicate whether an issue is a pull request.
**Dataset Contents**:
The dataset consists of two splits:
1. Train Split: Contains 4,863 records, each with features like URL, repository URL, labels URL, comments URL, HTML URL, ID, node ID, issue number, Title, labels, state, locked status, milestone, comments, creation date, update date, closing date, reactions, timeline URL, and more.
2. Test Split: Comprises 1,216 records with the same features as the train split.
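As a quick sketch (assuming the 🤗 `datasets` library), both splits can be loaded and the added `is_pull_request` flag can be used to separate true issues from pull requests:
```python
from datasets import load_dataset

# Load both splits of the GitHub issues corpus.
issues = load_dataset("Francesco-A/github-issues_huggingface-datasets")
train, test = issues["train"], issues["test"]
print(train.num_rows, test.num_rows)  # expected: 4863 and 1216

# Keep only true issues (exclude pull requests) using the added boolean column.
true_issues = train.filter(lambda row: not row["is_pull_request"])
```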
**Potential Uses**:
This dataset is valuable for various purposes, such as:
* Semantic search: Analyzing and retrieving issues based on semantic similarity.
* Multilabel classification: Classifying issues into multiple categories based on their labels.
* Exploratory analysis: Gaining insights into the trends and patterns within GitHub issues.
**Limitations and Risks**:
Users of this dataset should be aware of potential limitations, such as data incompleteness, bias in issue labeling, or outdated information. Additionally, data privacy and ethical considerations should be taken into account when using GitHub issues data.
**Access**:
The dataset is openly accessible to anyone interested in using it for research, analysis, or other suitable applications, and is publicly available for download and use.
**Note**:
Certain user-specific features, such as "user", "author_association", "assignee" and "assignees" have been excluded from the dataset to protect individual privacy and mitigate the risk of identifying users or contributors.
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tangjw20/brainTumor | ---
license: openrail
---
|
Multimodal-Fatima/OxfordFlowers_test_facebook_opt_1.3b_Attributes_Caption_ns_6149 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_16
num_bytes: 267305744.375
num_examples: 6149
- name: fewshot_1_bs_16
num_bytes: 269129531.375
num_examples: 6149
- name: fewshot_3_bs_16
num_bytes: 272760442.375
num_examples: 6149
download_size: 796855399
dataset_size: 809195718.125
---
# Dataset Card for "OxfordFlowers_test_facebook_opt_1.3b_Attributes_Caption_ns_6149"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_272 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 761330380
num_examples: 149515
download_size: 776252984
dataset_size: 761330380
---
# Dataset Card for "chunk_272"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pbwinter/hindi_masked_chars | ---
dataset_info:
features:
- name: text
dtype: string
- name: mask_text
dtype: string
splits:
- name: train
num_bytes: 982624255.6601729
num_examples: 123893
- name: test
num_bytes: 245662012.3398271
num_examples: 30974
download_size: 503748402
dataset_size: 1228286268.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/onozuka_komachi_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of onozuka_komachi/小野塚小町 (Touhou)
This is the dataset of onozuka_komachi/小野塚小町 (Touhou), containing 500 images and their tags.
The core tags of this character are `red_hair, two_side_up, hair_ornament, red_eyes, short_hair, breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 609.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/onozuka_komachi_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 379.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/onozuka_komachi_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1114 | 735.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/onozuka_komachi_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 557.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/onozuka_komachi_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1114 | 984.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/onozuka_komachi_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/onozuka_komachi_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, hair_bobbles, scythe, solo, smile, cleavage |
| 1 | 11 |  |  |  |  |  | 1girl, hair_bobbles, scythe, solo, spider_lily |
| 2 | 7 |  |  |  |  |  | 1girl, hair_bobbles, scythe, solo, cleavage, smile, spider_lily |
| 3 | 5 |  |  |  |  |  | 1girl, bangs, blue_dress, hair_bobbles, looking_at_viewer, obi, puffy_short_sleeves, smile, solo, coin, holding_scythe, open_mouth |
| 4 | 11 |  |  |  |  |  | 1girl, blue_dress, full_body, hair_bobbles, holding_scythe, obi, puffy_short_sleeves, solo, looking_at_viewer, tabi, bangs, coin, simple_background, smile, standing, white_socks, closed_mouth, white_background, sandals, blue_kimono, cleavage, frills |
| 5 | 15 |  |  |  |  |  | 2girls, hair_bobbles, green_hair, hat, scythe, smile, flower, cleavage, rod_of_remorse |
| 6 | 9 |  |  |  |  |  | 1boy, 1girl, blush, hair_bobbles, hetero, solo_focus, penis, nipples, smile, cum, huge_breasts, nude, paizuri, pov, looking_at_viewer, mosaic_censoring, pink_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hair_bobbles | scythe | solo | smile | cleavage | spider_lily | bangs | blue_dress | looking_at_viewer | obi | puffy_short_sleeves | coin | holding_scythe | open_mouth | full_body | tabi | simple_background | standing | white_socks | closed_mouth | white_background | sandals | blue_kimono | frills | 2girls | green_hair | hat | flower | rod_of_remorse | 1boy | blush | hetero | solo_focus | penis | nipples | cum | huge_breasts | nude | paizuri | pov | mosaic_censoring | pink_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:---------|:-------|:--------|:-----------|:--------------|:--------|:-------------|:--------------------|:------|:----------------------|:-------|:-----------------|:-------------|:------------|:-------|:--------------------|:-----------|:--------------|:---------------|:-------------------|:----------|:--------------|:---------|:---------|:-------------|:------|:---------|:-----------------|:-------|:--------|:---------|:-------------|:--------|:----------|:------|:---------------|:-------|:----------|:------|:-------------------|:------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | X | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 11 |  |  |  |  |  | X | X | | X | X | X | | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 5 | 15 |  |  |  |  |  | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
mtkinit/TestovanieFEasdasd | ---
pretty_name: TestovanieFEasdasd
---
# TestovanieFEasdasd
Created from AIOD platform |
aneeshas/imsdb-500tokendrama-movie-scripts | ---
dataset_info:
features:
- name: Drama
dtype: string
splits:
- name: train
num_bytes: 307903
num_examples: 652
download_size: 189402
dataset_size: 307903
---
# Dataset Card for "imsdb-500tokendrama-movie-scripts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sravaniayyagari/dataset-with-empty-and-duplicatekeys | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: Content
dtype: string
splits:
- name: train
num_bytes: 3092456
num_examples: 1722
- name: validation
num_bytes: 323980
num_examples: 189
download_size: 433088
dataset_size: 3416436
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
jamestalentium/cnn_dailymail_10_finetune | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: output_text
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 43944.50216465294
num_examples: 10
download_size: 25357
dataset_size: 43944.50216465294
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cnn_dailymail_10_finetune"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Abhijeet3922/ESG-Prospectus-Clarity-Category | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-classification
- zero-shot-classification
language:
- en
tags:
- finance
size_categories:
- 1K<n<10K
configs:
- config_name: esg-prospectus-clarity-category
data_files: "esg-prospectus-clarity-category.csv"
- config_name: esg-prospectus-clarity-granular-category
data_files: "esg-prospectus-clarity-granular-category.csv"
---
# Dataset Card for ESG-Prospectus-Clarity-Category
### Dataset Summary
This dataset is a manually annotated, quality training dataset of 1155 ESG language instances (4 classes), obtained via a data extraction pipeline from summary prospectuses of sustainable (ESG) funds.
The ESG sentences were extracted from the ‘Principal Investment Strategy’ sections of the documents. The four classes are as follows:
1. Specific ESG Language
2. Ambiguous ESG Language
3. Generic ESG language
4. Risk ESG language
All instances relate to ESG investment language present in fund prospectuses. Furthermore, all instances were annotated with language clarity classes.
### Supported Tasks and Leaderboards
Text Classification (Language style classification)
Few Shot Classification
### Languages
English
## Dataset Structure
### Data Instances
Total instances: 1155

Class-wise instances:
- 'Specific ESG': 320
- 'Ambiguous ESG': 283
- 'Generic ESG': 264
- 'Risk ESG': 288
### Data Fields
```
{ "Text": "The Sub-fund's weighted carbon footprint score is equal or better than that of the Custom Bloomberg Climate Transition Benchmark.",
"Label": "specific"
"Text": "The Sub-fund invests a minimum of 5% in green, social, sustainable, and/or sustainability-linked bonds.",
"Label": "specific"
"Text": "The Fund will seek to invest in companies with sustainable business models which have a strong consideration for ESG risks and opportunities.",
"Label": "ambiguous"
}
```
### Data Splits
There is no train/validation/test split.
However, the dataset is available at two levels of categorization:
`esg-prospectus-clarity-category.csv`: Number of classes: 4 ('specific', 'ambiguous', 'generic', 'risk')
`esg-prospectus-clarity-granular-category.csv`: Number of classes: 7 ('specific', 'ambiguous', 'generic', 'general-risk', 'performance-risk', 'data-risk', 'disclaimer-risk')
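A minimal loading sketch, assuming the 🤗 `datasets` library and the 'Text'/'Label' columns shown in the Data Fields example (each CSV config is exposed as a single `train` split):
```python
from collections import Counter

from datasets import load_dataset

# Load the 4-class categorization; use "esg-prospectus-clarity-granular-category"
# for the 7-class granular version.
ds = load_dataset(
    "Abhijeet3922/ESG-Prospectus-Clarity-Category",
    "esg-prospectus-clarity-category",
    split="train",
)
print(Counter(ds["Label"]))  # class distribution across the 1155 instances
```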
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
The process begins with downloading the public ‘Summary Prospectuses’ from literature sections of the official websites of various Asset Management Companies (AMCs).
We collected approximately 250 prospectuses of sustainable products.
#### Who are the source language producers?
The source data was written and published by various fund issuers (Asset Management Companies).
### Annotations
#### Annotation process
The dataset was divided into three subsets; each annotator was allocated 2 subsets of sentences and was given a few weeks to label them.
Consequently, each of the 1155 instances was annotated by 2 annotators. We release the standard dataset of sentences with 100% annotator agreement.
#### Who are the annotators?
The open-sourced dataset was annotated by 3 people with adequate knowledge of ESG investing, fluent in English, and with previous exposure to analyzing financial documents.
## Considerations for Using the Data
The dataset can be used to investigate the transparency of sustainability intentions in the language of ESG disclosures of sustainable funds.
### Discussion of Biases
The data instances might cover language from certain fund issuers only (not all); they were extracted from randomly chosen prospectuses from the collected corpus.
The dataset might be revised with broader coverage of prospectus language in the future.
### Licensing Information
This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/4.0/.
If you are interested in commercial use of the data, please contact the following author for an appropriate license:
- [Abhijeet Kumar](mailto:abhijeet.kumar@fmr.com)
### Citation Information
[More Information Needed]
### Contributions
Thanks to [Nazia Nafis](https://www.linkedin.com/in/nazianafis/) and [Mayank Singh](https://www.linkedin.com/in/mayank-singh-43761b155/) for contributing to the dataset creation process.
Any contribution or further research by the community are welcome. |
UmarRamzan/test | ---
dataset_info:
features:
- name: input_length
dtype: int64
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
- name: labels_length
dtype: int64
splits:
- name: train
num_bytes: 38425584
num_examples: 40
- name: validation
num_bytes: 38425640
num_examples: 40
download_size: 14431280
dataset_size: 76851224
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/11b6b1a6 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1335
dataset_size: 188
---
# Dataset Card for "11b6b1a6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_20 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1415020972.0
num_examples: 277891
download_size: 1441487121
dataset_size: 1415020972.0
---
# Dataset Card for "chunk_20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Anusha64/new-events | ---
license: mit
dataset_info:
features:
- name: Date_of_event
dtype: string
- name: Event
dtype: string
splits:
- name: train
num_bytes: 11879
num_examples: 62
- name: validation
num_bytes: 7242
num_examples: 31
download_size: 17352
dataset_size: 19121
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
vietgpt/OSCAR-2109 | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: url
dtype: string
- name: date
dtype: string
- name: perplexity
dtype: float64
splits:
- name: train
num_bytes: 16802536783.756039
num_examples: 5098334
download_size: 8245526034
dataset_size: 16802536783.756039
---
# Dataset Card for "OSCAR-2109"
Number of tokens: 2,884,522,212 |
BangumiBase/kamisamakiss | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Kamisama Kiss
This is the image base of the bangumi Kamisama Kiss. We detected 50 characters and 2,686 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
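If you only need the images of a single character, here is a minimal sketch with `huggingface_hub`, assuming the per-character packages live at the paths shown in the table below (e.g. `10/dataset.zip`):
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download the image package of one detected character (here character #10).
zip_file = hf_hub_download(
    repo_id='BangumiBase/kamisamakiss',
    repo_type='dataset',
    filename='10/dataset.zip',
)

# Extract it to a local directory.
out_dir = 'kamisamakiss_character_10'
os.makedirs(out_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(out_dir)
```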
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 11 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 65 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 40 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 24 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 59 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 63 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 50 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 20 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 122 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 122 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 544 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 176 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 18 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 33 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 29 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 22 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 32 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 16 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 27 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 257 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 31 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 14 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 41 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 10 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 25 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 17 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 106 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 12 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 86 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 16 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 9 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 10 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 51 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 26 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 11 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 25 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 8 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 10 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 28 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 40 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 9 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 10 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 9 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 11 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 16 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 12 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 32 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 8 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 6 | [Download](48/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| noise | 267 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
McGilbertus/infoleg_10k_ner | ---
license: unknown
---
Words in documents from infoLeg (Argentine regulations and laws), tagged according to the following table:

| Id | Tag | Entity type |
|---:|--------|------------------------|
| 0 | 0 | not used |
| 1 | B-NOR | type of norm |
| 2 | I-NOR | type of norm |
| 3 | B-ORG | organizations |
| 4 | I-ORG | organizations |
| 5 | B-REP | repository |
| 6 | I-REP | repository |
| 7 | B-FEC | dates |
| 8 | I-FEC | dates |
| 9 | B-MISC | miscellaneous entities |
| 10 | I-MISC | miscellaneous entities |
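A small hypothetical helper for decoding these integer tags into label strings (tag 0 is listed as "not used" and is assumed here to play the role of the outside/non-entity tag):
```python
# Hypothetical id-to-label mapping based on the table above.
ID2LABEL = {
    0: "O",        # listed as "not used"; assumed to be the outside/non-entity tag
    1: "B-NOR", 2: "I-NOR",     # type of norm
    3: "B-ORG", 4: "I-ORG",     # organizations
    5: "B-REP", 6: "I-REP",     # repository
    7: "B-FEC", 8: "I-FEC",     # dates
    9: "B-MISC", 10: "I-MISC",  # miscellaneous entities
}

def decode(tag_ids):
    """Convert a sequence of integer tags into label strings."""
    return [ID2LABEL[i] for i in tag_ids]

print(decode([3, 4, 0, 7, 8]))  # ['B-ORG', 'I-ORG', 'O', 'B-FEC', 'I-FEC']
```
|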
CyberHarem/kurokoma_saki_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kurokoma_saki (Touhou)
This is the dataset of kurokoma_saki (Touhou), containing 42 images and their tags.
The core tags of this character are `wings, black_hair, red_eyes, hat, black_wings, cowboy_hat, long_hair, bangs, brown_headwear, feathered_wings, breasts, hair_between_eyes, ponytail, tail, medium_breasts, horse_tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 42 | 58.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurokoma_saki_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 42 | 35.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurokoma_saki_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 102 | 71.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurokoma_saki_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 42 | 51.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurokoma_saki_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 102 | 96.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurokoma_saki_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kurokoma_saki_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, bandana, bare_shoulders, blue_shirt, brown_skirt, looking_at_viewer, off-shoulder_shirt, overskirt, solo, cleavage, cowboy_shot, hand_up, miniskirt, puffy_short_sleeves, standing, feathers, thighs, :d, large_breasts, open_mouth, plaid_skirt, pleated_skirt, short_hair, very_long_hair |
| 1 | 7 |  |  |  |  |  | 1girl, brown_footwear, brown_skirt, looking_at_viewer, overskirt, smile, solo, blue_shirt, puffy_short_sleeves, plaid, simple_background, white_background, bandana, closed_mouth, full_body, hand_on_headwear, hand_up, off-shoulder_shirt, bare_shoulders, knee_boots, multicolored_clothes, scarf |
| 2 | 6 |  |  |  |  |  | 1girl, bandana, blue_shirt, looking_at_viewer, solo, upper_body, bare_shoulders, off-shoulder_shirt, puffy_short_sleeves, simple_background, blush, grin, cleavage, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bandana | bare_shoulders | blue_shirt | brown_skirt | looking_at_viewer | off-shoulder_shirt | overskirt | solo | cleavage | cowboy_shot | hand_up | miniskirt | puffy_short_sleeves | standing | feathers | thighs | :d | large_breasts | open_mouth | plaid_skirt | pleated_skirt | short_hair | very_long_hair | brown_footwear | smile | plaid | simple_background | white_background | closed_mouth | full_body | hand_on_headwear | knee_boots | multicolored_clothes | scarf | upper_body | blush | grin |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-----------------|:-------------|:--------------|:--------------------|:---------------------|:------------|:-------|:-----------|:--------------|:----------|:------------|:----------------------|:-----------|:-----------|:---------|:-----|:----------------|:-------------|:--------------|:----------------|:-------------|:-----------------|:-----------------|:--------|:--------|:--------------------|:-------------------|:---------------|:------------|:-------------------|:-------------|:-----------------------|:--------|:-------------|:--------|:-------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | X | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | | X | X | | X | X | | | | X | | | | | | | | | | | | | | X | X | | | | | | | X | X | X |
|
rvox/amanhecer | ---
license: openrail
---
|
open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-dare | ---
pretty_name: Evaluation run of louisbrulenaudet/Pearl-34B-dare
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [louisbrulenaudet/Pearl-34B-dare](https://huggingface.co/louisbrulenaudet/Pearl-34B-dare)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-dare\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-13T04:00:24.953384](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-dare/blob/main/results_2024-02-13T04-00-24.953384.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7587346915989489,\n\
\ \"acc_stderr\": 0.028351398155327497,\n \"acc_norm\": 0.763853167003337,\n\
\ \"acc_norm_stderr\": 0.028878179543793354,\n \"mc1\": 0.5116279069767442,\n\
\ \"mc1_stderr\": 0.017498767175740084,\n \"mc2\": 0.6850136264565471,\n\
\ \"mc2_stderr\": 0.014412881216443527\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620192,\n\
\ \"acc_norm\": 0.6843003412969283,\n \"acc_norm_stderr\": 0.013582571095815291\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6395140410276837,\n\
\ \"acc_stderr\": 0.0047916019756127646,\n \"acc_norm\": 0.8360884285998805,\n\
\ \"acc_norm_stderr\": 0.003694387361177659\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.7481481481481481,\n \"acc_stderr\": 0.03749850709174021,\n\
\ \"acc_norm\": 0.7481481481481481,\n \"acc_norm_stderr\": 0.03749850709174021\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n\
\ \"acc_stderr\": 0.026293995855474945,\n \"acc_norm\": 0.881578947368421,\n\
\ \"acc_norm_stderr\": 0.026293995855474945\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8075471698113208,\n\
\ \"acc_stderr\": 0.024262979839372274,\n \"acc_norm\": 0.8075471698113208,\n\
\ \"acc_norm_stderr\": 0.024262979839372274\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.9027777777777778,\n \"acc_stderr\": 0.024774516250440182,\n\
\ \"acc_norm\": 0.9027777777777778,\n \"acc_norm_stderr\": 0.024774516250440182\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818317,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818317\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5588235294117647,\n\
\ \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.5588235294117647,\n\
\ \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.774468085106383,\n\
\ \"acc_stderr\": 0.027321078417387536,\n \"acc_norm\": 0.774468085106383,\n\
\ \"acc_norm_stderr\": 0.027321078417387536\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.046446020912223177,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.046446020912223177\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.7379310344827587,\n \"acc_stderr\": 0.03664666337225257,\n \"\
acc_norm\": 0.7379310344827587,\n \"acc_norm_stderr\": 0.03664666337225257\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7248677248677249,\n \"acc_stderr\": 0.023000086859068642,\n \"\
acc_norm\": 0.7248677248677249,\n \"acc_norm_stderr\": 0.023000086859068642\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.896774193548387,\n\
\ \"acc_stderr\": 0.01730838128103452,\n \"acc_norm\": 0.896774193548387,\n\
\ \"acc_norm_stderr\": 0.01730838128103452\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6354679802955665,\n \"acc_stderr\": 0.0338640574606209,\n\
\ \"acc_norm\": 0.6354679802955665,\n \"acc_norm_stderr\": 0.0338640574606209\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n\
\ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9292929292929293,\n \"acc_stderr\": 0.018263105420199505,\n \"\
acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.018263105420199505\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.011464523356953162,\n\
\ \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.011464523356953162\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8128205128205128,\n \"acc_stderr\": 0.019776601086550032,\n\
\ \"acc_norm\": 0.8128205128205128,\n \"acc_norm_stderr\": 0.019776601086550032\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4740740740740741,\n \"acc_stderr\": 0.03044452852881074,\n \
\ \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.03044452852881074\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02476290267805793,\n \
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02476290267805793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9137614678899083,\n \"acc_stderr\": 0.012035597300116245,\n \"\
acc_norm\": 0.9137614678899083,\n \"acc_norm_stderr\": 0.012035597300116245\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6574074074074074,\n \"acc_stderr\": 0.032365852526021574,\n \"\
acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.032365852526021574\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316942,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316942\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9113924050632911,\n \"acc_stderr\": 0.01849831520686538,\n \
\ \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.01849831520686538\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n\
\ \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n\
\ \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.030381596756651655,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.030381596756651655\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.02632138319878367,\n\
\ \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.02632138319878367\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n\
\ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n\
\ \"acc_stderr\": 0.015006312806446912,\n \"acc_norm\": 0.9444444444444444,\n\
\ \"acc_norm_stderr\": 0.015006312806446912\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9029374201787995,\n\
\ \"acc_stderr\": 0.0105864747120183,\n \"acc_norm\": 0.9029374201787995,\n\
\ \"acc_norm_stderr\": 0.0105864747120183\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.021152676966575277,\n\
\ \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.021152676966575277\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8078212290502793,\n\
\ \"acc_stderr\": 0.013177759505210093,\n \"acc_norm\": 0.8078212290502793,\n\
\ \"acc_norm_stderr\": 0.013177759505210093\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8398692810457516,\n \"acc_stderr\": 0.020998740930362303,\n\
\ \"acc_norm\": 0.8398692810457516,\n \"acc_norm_stderr\": 0.020998740930362303\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8006430868167203,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.8006430868167203,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.018877353839571846,\n\
\ \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.018877353839571846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6347517730496454,\n \"acc_stderr\": 0.02872386385328127,\n \
\ \"acc_norm\": 0.6347517730496454,\n \"acc_norm_stderr\": 0.02872386385328127\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5743155149934811,\n\
\ \"acc_stderr\": 0.012628393551811942,\n \"acc_norm\": 0.5743155149934811,\n\
\ \"acc_norm_stderr\": 0.012628393551811942\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.023345163616544855,\n\
\ \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.023345163616544855\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8104575163398693,\n \"acc_stderr\": 0.015856152189980252,\n \
\ \"acc_norm\": 0.8104575163398693,\n \"acc_norm_stderr\": 0.015856152189980252\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8448979591836735,\n \"acc_stderr\": 0.0231747988612186,\n\
\ \"acc_norm\": 0.8448979591836735,\n \"acc_norm_stderr\": 0.0231747988612186\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.022076326101824657,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.022076326101824657\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5903614457831325,\n\
\ \"acc_stderr\": 0.03828401115079021,\n \"acc_norm\": 0.5903614457831325,\n\
\ \"acc_norm_stderr\": 0.03828401115079021\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.02353755765789256,\n\
\ \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.02353755765789256\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5116279069767442,\n\
\ \"mc1_stderr\": 0.017498767175740084,\n \"mc2\": 0.6850136264565471,\n\
\ \"mc2_stderr\": 0.014412881216443527\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8176795580110497,\n \"acc_stderr\": 0.010851565594267198\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6353297952994693,\n \
\ \"acc_stderr\": 0.013258428375662245\n }\n}\n```"
repo_url: https://huggingface.co/louisbrulenaudet/Pearl-34B-dare
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|arc:challenge|25_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|gsm8k|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hellaswag|10_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T04-00-24.953384.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T04-00-24.953384.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- '**/details_harness|winogrande|5_2024-02-13T04-00-24.953384.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-13T04-00-24.953384.parquet'
- config_name: results
data_files:
- split: 2024_02_13T04_00_24.953384
path:
- results_2024-02-13T04-00-24.953384.parquet
- split: latest
path:
- results_2024-02-13T04-00-24.953384.parquet
---
# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-34B-dare
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [louisbrulenaudet/Pearl-34B-dare](https://huggingface.co/louisbrulenaudet/Pearl-34B-dare) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-dare",
"harness_winogrande_5",
split="train")
```
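The aggregated metrics of a run can be loaded in the same way through the `results` configuration listed in this card's metadata. This is a minimal sketch, assuming the configuration and split names defined above:
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split points to the most recent run
results = load_dataset(
    "open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-dare",
    "results",
    split="latest",
)
print(results[0])
```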
## Latest results
These are the [latest results from run 2024-02-13T04:00:24.953384](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-dare/blob/main/results_2024-02-13T04-00-24.953384.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7587346915989489,
"acc_stderr": 0.028351398155327497,
"acc_norm": 0.763853167003337,
"acc_norm_stderr": 0.028878179543793354,
"mc1": 0.5116279069767442,
"mc1_stderr": 0.017498767175740084,
"mc2": 0.6850136264565471,
"mc2_stderr": 0.014412881216443527
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620192,
"acc_norm": 0.6843003412969283,
"acc_norm_stderr": 0.013582571095815291
},
"harness|hellaswag|10": {
"acc": 0.6395140410276837,
"acc_stderr": 0.0047916019756127646,
"acc_norm": 0.8360884285998805,
"acc_norm_stderr": 0.003694387361177659
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7481481481481481,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.7481481481481481,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474945,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474945
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372274,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9027777777777778,
"acc_stderr": 0.024774516250440182,
"acc_norm": 0.9027777777777778,
"acc_norm_stderr": 0.024774516250440182
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387536,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387536
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7379310344827587,
"acc_stderr": 0.03664666337225257,
"acc_norm": 0.7379310344827587,
"acc_norm_stderr": 0.03664666337225257
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7248677248677249,
"acc_stderr": 0.023000086859068642,
"acc_norm": 0.7248677248677249,
"acc_norm_stderr": 0.023000086859068642
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.01730838128103452,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.01730838128103452
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.018263105420199505,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.018263105420199505
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.011464523356953162,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.011464523356953162
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8128205128205128,
"acc_stderr": 0.019776601086550032,
"acc_norm": 0.8128205128205128,
"acc_norm_stderr": 0.019776601086550032
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.03044452852881074,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.03044452852881074
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02476290267805793,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02476290267805793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4966887417218543,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.4966887417218543,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9137614678899083,
"acc_stderr": 0.012035597300116245,
"acc_norm": 0.9137614678899083,
"acc_norm_stderr": 0.012035597300116245
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.032365852526021574,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.032365852526021574
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316942,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316942
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.01849831520686538,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.01849831520686538
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.030381596756651655,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.030381596756651655
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.02632138319878367,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.02632138319878367
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446912,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446912
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9029374201787995,
"acc_stderr": 0.0105864747120183,
"acc_norm": 0.9029374201787995,
"acc_norm_stderr": 0.0105864747120183
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.021152676966575277,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.021152676966575277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8078212290502793,
"acc_stderr": 0.013177759505210093,
"acc_norm": 0.8078212290502793,
"acc_norm_stderr": 0.013177759505210093
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8398692810457516,
"acc_stderr": 0.020998740930362303,
"acc_norm": 0.8398692810457516,
"acc_norm_stderr": 0.020998740930362303
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8006430868167203,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.8006430868167203,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.018877353839571846,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.018877353839571846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6347517730496454,
"acc_stderr": 0.02872386385328127,
"acc_norm": 0.6347517730496454,
"acc_norm_stderr": 0.02872386385328127
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5743155149934811,
"acc_stderr": 0.012628393551811942,
"acc_norm": 0.5743155149934811,
"acc_norm_stderr": 0.012628393551811942
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8198529411764706,
"acc_stderr": 0.023345163616544855,
"acc_norm": 0.8198529411764706,
"acc_norm_stderr": 0.023345163616544855
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8104575163398693,
"acc_stderr": 0.015856152189980252,
"acc_norm": 0.8104575163398693,
"acc_norm_stderr": 0.015856152189980252
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8448979591836735,
"acc_stderr": 0.0231747988612186,
"acc_norm": 0.8448979591836735,
"acc_norm_stderr": 0.0231747988612186
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824657,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824657
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5903614457831325,
"acc_stderr": 0.03828401115079021,
"acc_norm": 0.5903614457831325,
"acc_norm_stderr": 0.03828401115079021
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.02353755765789256,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.02353755765789256
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5116279069767442,
"mc1_stderr": 0.017498767175740084,
"mc2": 0.6850136264565471,
"mc2_stderr": 0.014412881216443527
},
"harness|winogrande|5": {
"acc": 0.8176795580110497,
"acc_stderr": 0.010851565594267198
},
"harness|gsm8k|5": {
"acc": 0.6353297952994693,
"acc_stderr": 0.013258428375662245
}
}
```
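If you prefer to work with the raw results file linked above rather than the parquet configurations, the JSON can be fetched directly from the repository. This is a minimal sketch, assuming the file sits at the repository root as in the link; the exact top-level layout of the JSON may differ from the summary dict shown here:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON referenced above from the dataset repository
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-dare",
    repo_type="dataset",
    filename="results_2024-02-13T04-00-24.953384.json",
)

# Inspect the top-level structure before digging into individual metrics
with open(results_path) as f:
    results = json.load(f)
print(sorted(results.keys()))
```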
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/sariel_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sariel/サリエル (Touhou)
This is the dataset of sariel/サリエル (Touhou), containing 45 images and their tags.
The core tags of this character are `long_hair, wings, multiple_wings, angel_wings, very_long_hair, blue_hair, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 45 | 42.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sariel_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 45 | 29.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sariel_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 73 | 44.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sariel_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 45 | 39.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sariel_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 73 | 56.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sariel_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sariel_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
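The pre-packaged `IMG+TXT` archives from the table above can be fetched in the same way. This is a minimal sketch for the 800px variant, assuming only the file names listed in the download links:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT archive listed in the package table
zip_file = hf_hub_download(
    repo_id='CyberHarem/sariel_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract images and their accompanying .txt tag files
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```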
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, solo, staff, closed_eyes, long_sleeves, blue_dress, smile |
| 1 | 6 |  |  |  |  |  | 1girl, blue_dress, long_sleeves, solo, breasts, closed_mouth, feathered_wings, looking_at_viewer, smile, white_wings, wide_sleeves, holding, angel, bangs, blush, staff, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | staff | closed_eyes | long_sleeves | blue_dress | smile | breasts | closed_mouth | feathered_wings | looking_at_viewer | white_wings | wide_sleeves | holding | angel | bangs | blush | white_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------|:---------------|:-------------|:--------|:----------|:---------------|:------------------|:--------------------|:--------------|:---------------|:----------|:--------|:--------|:--------|:--------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_yam-peleg__Experiment22-7B | ---
pretty_name: Evaluation run of yam-peleg/Experiment22-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yam-peleg/Experiment22-7B](https://huggingface.co/yam-peleg/Experiment22-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yam-peleg__Experiment22-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-22T17:14:31.869913](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment22-7B/blob/main/results_2024-02-22T17-14-31.869913.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6473088681209457,\n\
\ \"acc_stderr\": 0.03225485470358198,\n \"acc_norm\": 0.6467816374881623,\n\
\ \"acc_norm_stderr\": 0.03293070375242111,\n \"mc1\": 0.6438188494492044,\n\
\ \"mc1_stderr\": 0.016763790728446325,\n \"mc2\": 0.7947465358188641,\n\
\ \"mc2_stderr\": 0.013382856229328008\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.013363080107244484,\n\
\ \"acc_norm\": 0.7150170648464164,\n \"acc_norm_stderr\": 0.013191348179838793\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7133041226847242,\n\
\ \"acc_stderr\": 0.004512940497462743,\n \"acc_norm\": 0.8888667596096396,\n\
\ \"acc_norm_stderr\": 0.003136547276689888\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800886,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800886\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.02537952491077839,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.02537952491077839\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n\
\ \"acc_stderr\": 0.024137632429337714,\n \"acc_norm\": 0.7645161290322581,\n\
\ \"acc_norm_stderr\": 0.024137632429337714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.026156867523931048,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.026156867523931048\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579665,\n\
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579665\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973127,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973127\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.423463687150838,\n\
\ \"acc_stderr\": 0.016525425898773503,\n \"acc_norm\": 0.423463687150838,\n\
\ \"acc_norm_stderr\": 0.016525425898773503\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.02977945095730307,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.02977945095730307\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4758800521512386,\n\
\ \"acc_stderr\": 0.012755368722863935,\n \"acc_norm\": 0.4758800521512386,\n\
\ \"acc_norm_stderr\": 0.012755368722863935\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6438188494492044,\n\
\ \"mc1_stderr\": 0.016763790728446325,\n \"mc2\": 0.7947465358188641,\n\
\ \"mc2_stderr\": 0.013382856229328008\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.0100992082460656\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6664139499620925,\n \
\ \"acc_stderr\": 0.012987282131410809\n }\n}\n```"
repo_url: https://huggingface.co/yam-peleg/Experiment22-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|arc:challenge|25_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|gsm8k|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hellaswag|10_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T17-14-31.869913.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T17-14-31.869913.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- '**/details_harness|winogrande|5_2024-02-22T17-14-31.869913.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-22T17-14-31.869913.parquet'
- config_name: results
data_files:
- split: 2024_02_22T17_14_31.869913
path:
- results_2024-02-22T17-14-31.869913.parquet
- split: latest
path:
- results_2024-02-22T17-14-31.869913.parquet
---
# Dataset Card for Evaluation run of yam-peleg/Experiment22-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yam-peleg/Experiment22-7B](https://huggingface.co/yam-peleg/Experiment22-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yam-peleg__Experiment22-7B",
"harness_winogrande_5",
split="train")
```
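For instance, to retrieve the aggregated scores directly, you can load the "results" configuration instead; this is a minimal sketch assuming the "latest" split (defined in the configs above) points to the most recent evaluation run:
```python
from datasets import load_dataset

# Load the aggregated results of the run; "latest" is the split
# listed in the configs above for the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_yam-peleg__Experiment22-7B",
                       "results",
                       split="latest")
print(results[0])
```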
## Latest results
These are the [latest results from run 2024-02-22T17:14:31.869913](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment22-7B/blob/main/results_2024-02-22T17-14-31.869913.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6473088681209457,
"acc_stderr": 0.03225485470358198,
"acc_norm": 0.6467816374881623,
"acc_norm_stderr": 0.03293070375242111,
"mc1": 0.6438188494492044,
"mc1_stderr": 0.016763790728446325,
"mc2": 0.7947465358188641,
"mc2_stderr": 0.013382856229328008
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.013363080107244484,
"acc_norm": 0.7150170648464164,
"acc_norm_stderr": 0.013191348179838793
},
"harness|hellaswag|10": {
"acc": 0.7133041226847242,
"acc_stderr": 0.004512940497462743,
"acc_norm": 0.8888667596096396,
"acc_norm_stderr": 0.003136547276689888
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800886,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800886
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.02537952491077839,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.02537952491077839
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.024137632429337714,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.024137632429337714
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931048,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931048
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579665,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579665
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973127,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973127
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.423463687150838,
"acc_stderr": 0.016525425898773503,
"acc_norm": 0.423463687150838,
"acc_norm_stderr": 0.016525425898773503
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.02977945095730307,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.02977945095730307
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4758800521512386,
"acc_stderr": 0.012755368722863935,
"acc_norm": 0.4758800521512386,
"acc_norm_stderr": 0.012755368722863935
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6438188494492044,
"mc1_stderr": 0.016763790728446325,
"mc2": 0.7947465358188641,
"mc2_stderr": 0.013382856229328008
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.0100992082460656
},
"harness|gsm8k|5": {
"acc": 0.6664139499620925,
"acc_stderr": 0.012987282131410809
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_KnutJaegersberg__Walter-Mistral-7B | ---
pretty_name: Evaluation run of KnutJaegersberg/Walter-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/Walter-Mistral-7B](https://huggingface.co/KnutJaegersberg/Walter-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Walter-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-18T13:56:54.383446](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Walter-Mistral-7B/blob/main/results_2023-12-18T13-56-54.383446.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5800043815414543,\n\
\ \"acc_stderr\": 0.03295379522396862,\n \"acc_norm\": 0.590722838232918,\n\
\ \"acc_norm_stderr\": 0.03383616633951902,\n \"mc1\": 0.2668298653610771,\n\
\ \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.399340955094558,\n\
\ \"mc2_stderr\": 0.013888761310440584\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5469283276450512,\n \"acc_stderr\": 0.014546892052005626,\n\
\ \"acc_norm\": 0.5887372013651877,\n \"acc_norm_stderr\": 0.01437944106852208\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6326428998207528,\n\
\ \"acc_stderr\": 0.0048109966523247295,\n \"acc_norm\": 0.8342959569806812,\n\
\ \"acc_norm_stderr\": 0.0037105487209054206\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.029514703583981762,\n\
\ \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.029514703583981762\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137282,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137282\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6838709677419355,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.6838709677419355,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.02510682066053976,\n \
\ \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.02510682066053976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.031968769891957786,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.031968769891957786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7761467889908257,\n\
\ \"acc_stderr\": 0.017871217767790232,\n \"acc_norm\": 0.7761467889908257,\n\
\ \"acc_norm_stderr\": 0.017871217767790232\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n\
\ \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693247,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693247\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955934,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7701149425287356,\n\
\ \"acc_stderr\": 0.015046301846691814,\n \"acc_norm\": 0.7701149425287356,\n\
\ \"acc_norm_stderr\": 0.015046301846691814\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.026074314851657083,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.026074314851657083\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.013378001241813056,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.013378001241813056\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.02705797462449438,\n\
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.02705797462449438\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291463,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291463\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43089960886571055,\n\
\ \"acc_stderr\": 0.012647695889547231,\n \"acc_norm\": 0.43089960886571055,\n\
\ \"acc_norm_stderr\": 0.012647695889547231\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6274509803921569,\n \"acc_stderr\": 0.019559646809215927,\n \
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.019559646809215927\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.031512360446742695,\n\
\ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.031512360446742695\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2668298653610771,\n\
\ \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.399340955094558,\n\
\ \"mc2_stderr\": 0.013888761310440584\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838241\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225181\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/Walter-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|arc:challenge|25_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|gsm8k|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hellaswag|10_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T13-56-54.383446.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T13-56-54.383446.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- '**/details_harness|winogrande|5_2023-12-18T13-56-54.383446.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-18T13-56-54.383446.parquet'
- config_name: results
data_files:
- split: 2023_12_18T13_56_54.383446
path:
- results_2023-12-18T13-56-54.383446.parquet
- split: latest
path:
- results_2023-12-18T13-56-54.383446.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/Walter-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Walter-Mistral-7B](https://huggingface.co/KnutJaegersberg/Walter-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Walter-Mistral-7B",
"harness_winogrande_5",
split="train")
```
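The aggregated metrics are exposed through the "results" configuration declared in the YAML above; a minimal sketch (reusing the repo id, config name, and "latest" split listed there) to pull the most recent aggregated run:
```python
from datasets import load_dataset

# Load the aggregated results of the most recent run
# ("results" and "latest" are the config and split names declared in the YAML above).
results = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__Walter-Mistral-7B",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated metrics of the run
```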
## Latest results
These are the [latest results from run 2023-12-18T13:56:54.383446](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Walter-Mistral-7B/blob/main/results_2023-12-18T13-56-54.383446.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration and in the "latest" split for each eval):
```json
{
"all": {
"acc": 0.5800043815414543,
"acc_stderr": 0.03295379522396862,
"acc_norm": 0.590722838232918,
"acc_norm_stderr": 0.03383616633951902,
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.399340955094558,
"mc2_stderr": 0.013888761310440584
},
"harness|arc:challenge|25": {
"acc": 0.5469283276450512,
"acc_stderr": 0.014546892052005626,
"acc_norm": 0.5887372013651877,
"acc_norm_stderr": 0.01437944106852208
},
"harness|hellaswag|10": {
"acc": 0.6326428998207528,
"acc_stderr": 0.0048109966523247295,
"acc_norm": 0.8342959569806812,
"acc_norm_stderr": 0.0037105487209054206
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.04033565667848319,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.04033565667848319
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.029514703583981762,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.029514703583981762
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137282,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137282
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6838709677419355,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.6838709677419355,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.02510682066053976,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.02510682066053976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.031968769891957786,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.031968769891957786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790232,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790232
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693247,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693247
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.029312814153955934,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.029312814153955934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7701149425287356,
"acc_stderr": 0.015046301846691814,
"acc_norm": 0.7701149425287356,
"acc_norm_stderr": 0.015046301846691814
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.026074314851657083,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.026074314851657083
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2,
"acc_stderr": 0.013378001241813056,
"acc_norm": 0.2,
"acc_norm_stderr": 0.013378001241813056
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.02705797462449438,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.02705797462449438
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291463,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291463
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43089960886571055,
"acc_stderr": 0.012647695889547231,
"acc_norm": 0.43089960886571055,
"acc_norm_stderr": 0.012647695889547231
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.019559646809215927,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.019559646809215927
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.031512360446742695,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.031512360446742695
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.399340955094558,
"mc2_stderr": 0.013888761310440584
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838241
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225181
}
}
```
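The same numbers can also be read straight from the results file linked above; a small sketch, assuming the metrics shown in the snippet sit under a top-level "results" key of that JSON file:
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file linked above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_KnutJaegersberg__Walter-Mistral-7B",
    repo_type="dataset",
    filename="results_2023-12-18T13-56-54.383446.json",
)
with open(path) as f:
    report = json.load(f)

# Assumption: the per-task metrics printed above live under a "results" key;
# fall back to the top level if the file is already the bare metrics dict.
metrics = report.get("results", report)
print(metrics["all"])  # headline accuracy / stderr aggregates
```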
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pccl-org/formal-logic-simple-order-simple-objects-blivergent-500 | ---
dataset_info:
features:
- name: greater_than
dtype: string
- name: less_than
dtype: string
- name: correct_example
sequence: string
- name: incorrect_example
sequence: string
- name: distance
dtype: int64
- name: index
dtype: int64
splits:
- name: train
num_bytes: 19635650
num_examples: 124750
download_size: 3888871
dataset_size: 19635650
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "formal-logic-simple-order-simple-objects-blivergent-500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/m1895_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of m1895/M1895/纳甘左轮 (Girls' Frontline)
This is the dataset of m1895/M1895/纳甘左轮 (Girls' Frontline), containing 204 images and their tags.
The core tags of this character are `blonde_hair, red_eyes, hat, bangs, long_hair, hair_between_eyes, fur_hat, white_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 204 | 331.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1895_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 204 | 154.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1895_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 531 | 375.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1895_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 204 | 275.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1895_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 531 | 577.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1895_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/m1895_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
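If you only need one of the IMG+TXT packages listed in the table above (e.g. the `800` version), a similar minimal sketch without waifuc could look like the following; the file name is taken from the download links in the package table:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package (see the package table above)
zip_file = hf_hub_download(
    repo_id='CyberHarem/m1895_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract the image/text pairs into a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```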
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, belt_buckle, blush, brown_gloves, brown_skirt, fingerless_gloves, handgun, holding_gun, jacket_on_shoulders, long_sleeves, revolver, solo, white_jacket, white_shirt, brown_belt, center_frills, looking_at_viewer, object_namesake, open_mouth, smile, black_gloves, black_socks, kneehighs, one_eye_closed, white_background |
| 1 | 7 |  |  |  |  |  | 1girl, belt_buckle, blush, brown_skirt, center_frills, jacket_on_shoulders, long_sleeves, solo, white_jacket, white_shirt, fake_animal_ears, mod3_(girls'_frontline), pleated_skirt, animal_ear_fluff, animal_hat, brown_belt, looking_at_viewer, open_mouth, red_belt, white_background, brown_gloves, simple_background, single_glove, :d, fingerless_gloves, hand_on_hip, star_(symbol) |
| 2 | 8 |  |  |  |  |  | 1girl, blush, brown_belt, brown_skirt, center_frills, long_sleeves, white_jacket, white_shirt, belt_buckle, holding, simple_background, solo, black_gloves, brown_gloves, fingerless_gloves, jacket_on_shoulders, open_mouth, :d, grey_background, looking_at_viewer, ushanka, brown_background, closed_eyes, white_background |
| 3 | 9 |  |  |  |  |  | 1girl, blush, fingerless_gloves, solo, white_jacket, white_shirt, long_sleeves, upper_body, black_gloves, brown_gloves, open_mouth, simple_background, looking_at_viewer, white_background, center_frills, :d |
| 4 | 9 |  |  |  |  |  | 1girl, :d, black_headwear, blue_cape, blue_flower, hair_flower, mini_top_hat, official_alternate_costume, open_mouth, solo, tilted_headwear, blush, vertical_stripes, electric_guitar, holding_instrument, looking_at_viewer, puffy_short_sleeves, braid, mismatched_gloves, one_side_up, striped_gloves, black_shirt, elbow_gloves, fingerless_gloves, frills, holding_microphone, long_sleeves, upper_body, white_gloves, white_shirt, ascot, collared_shirt, jacket, white_background |
| 5 | 5 |  |  |  |  |  | 1girl, blue_headwear, earrings, eyewear_on_head, official_alternate_costume, simple_background, sunglasses, ahoge, blue_flower, hair_flower, looking_at_viewer, smile, solo, white_background, bare_shoulders, blue_dress, blush, choker, closed_mouth, upper_body, black_dress, collarbone, mini_top_hat, single_hair_bun |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | belt_buckle | blush | brown_gloves | brown_skirt | fingerless_gloves | handgun | holding_gun | jacket_on_shoulders | long_sleeves | revolver | solo | white_jacket | white_shirt | brown_belt | center_frills | looking_at_viewer | object_namesake | open_mouth | smile | black_gloves | black_socks | kneehighs | one_eye_closed | white_background | fake_animal_ears | mod3_(girls'_frontline) | pleated_skirt | animal_ear_fluff | animal_hat | red_belt | simple_background | single_glove | :d | hand_on_hip | star_(symbol) | holding | grey_background | ushanka | brown_background | closed_eyes | upper_body | black_headwear | blue_cape | blue_flower | hair_flower | mini_top_hat | official_alternate_costume | tilted_headwear | vertical_stripes | electric_guitar | holding_instrument | puffy_short_sleeves | braid | mismatched_gloves | one_side_up | striped_gloves | black_shirt | elbow_gloves | frills | holding_microphone | white_gloves | ascot | collared_shirt | jacket | blue_headwear | earrings | eyewear_on_head | sunglasses | ahoge | bare_shoulders | blue_dress | choker | closed_mouth | black_dress | collarbone | single_hair_bun |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------|:---------------|:--------------|:--------------------|:----------|:--------------|:----------------------|:---------------|:-----------|:-------|:---------------|:--------------|:-------------|:----------------|:--------------------|:------------------|:-------------|:--------|:---------------|:--------------|:------------|:-----------------|:-------------------|:-------------------|:--------------------------|:----------------|:-------------------|:-------------|:-----------|:--------------------|:---------------|:-----|:--------------|:----------------|:----------|:------------------|:----------|:-------------------|:--------------|:-------------|:-----------------|:------------|:--------------|:--------------|:---------------|:-----------------------------|:------------------|:-------------------|:------------------|:---------------------|:----------------------|:--------|:--------------------|:--------------|:-----------------|:--------------|:---------------|:---------|:---------------------|:---------------|:--------|:-----------------|:---------|:----------------|:-----------|:------------------|:-------------|:--------|:-----------------|:-------------|:---------|:---------------|:--------------|:-------------|:------------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | | | X | X | | X | X | X | X | X | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | X | X | | | X | X | | X | X | X | X | X | X | | X | | X | | | | X | | | | | | | X | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | X | X | | X | | | | X | | X | X | X | | X | X | | X | | X | | | | X | | | | | | | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | X | | | X | | | | X | | X | | X | | | X | | X | | | | | | X | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | | | | | | | | | X | | | | | X | | | X | | | | | X | | | | | | | X | | | | | | | | | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
K1ngAi/game_changer_Dataset | ---
license: openrail
---
|
MicPie/unpredictable_dummies-com | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-dummies-com
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-dummies-com" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://ethanperez.net/unpredictable
- **Repository:** https://github.com/JunShern/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
- **Point of Contact:** junshern@nyu.edu, perez@nyu.edu
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/MicPie/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on a manual human quality rating (please see our publication for details of the ratings):
* [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low)
* [UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium)
* [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high)
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-baseball-fantasysports-yahoo-com](https://huggingface.co/datasets/MicPie/unpredictable_baseball-fantasysports-yahoo-com)
* [UnpredicTable-bulbapedia-bulbagarden-net](https://huggingface.co/datasets/MicPie/unpredictable_bulbapedia-bulbagarden-net)
* [UnpredicTable-cappex-com](https://huggingface.co/datasets/MicPie/unpredictable_cappex-com)
* [UnpredicTable-cram-com](https://huggingface.co/datasets/MicPie/unpredictable_cram-com)
* [UnpredicTable-dividend-com](https://huggingface.co/datasets/MicPie/unpredictable_dividend-com)
* [UnpredicTable-dummies-com](https://huggingface.co/datasets/MicPie/unpredictable_dummies-com)
* [UnpredicTable-en-wikipedia-org](https://huggingface.co/datasets/MicPie/unpredictable_en-wikipedia-org)
* [UnpredicTable-ensembl-org](https://huggingface.co/datasets/MicPie/unpredictable_ensembl-org)
* [UnpredicTable-gamefaqs-com](https://huggingface.co/datasets/MicPie/unpredictable_gamefaqs-com)
* [UnpredicTable-mgoblog-com](https://huggingface.co/datasets/MicPie/unpredictable_mgoblog-com)
* [UnpredicTable-mmo-champion-com](https://huggingface.co/datasets/MicPie/unpredictable_mmo-champion-com)
* [UnpredicTable-msdn-microsoft-com](https://huggingface.co/datasets/MicPie/unpredictable_msdn-microsoft-com)
* [UnpredicTable-phonearena-com](https://huggingface.co/datasets/MicPie/unpredictable_phonearena-com)
* [UnpredicTable-sittercity-com](https://huggingface.co/datasets/MicPie/unpredictable_sittercity-com)
* [UnpredicTable-sporcle-com](https://huggingface.co/datasets/MicPie/unpredictable_sporcle-com)
* [UnpredicTable-studystack-com](https://huggingface.co/datasets/MicPie/unpredictable_studystack-com)
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/MicPie/unpredictable_support-google-com)
* [UnpredicTable-w3-org](https://huggingface.co/datasets/MicPie/unpredictable_w3-org)
* [UnpredicTable-wiki-openmoko-org](https://huggingface.co/datasets/MicPie/unpredictable_wiki-openmoko-org)
* [UnpredicTable-wkdu-org](https://huggingface.co/datasets/MicPie/unpredictable_wkdu-org)
* UnpredicTable data subsets based on clustering (for the clustering details please see our publication):
* [UnpredicTable-cluster00](https://huggingface.co/datasets/MicPie/unpredictable_cluster00)
* [UnpredicTable-cluster01](https://huggingface.co/datasets/MicPie/unpredictable_cluster01)
* [UnpredicTable-cluster02](https://huggingface.co/datasets/MicPie/unpredictable_cluster02)
* [UnpredicTable-cluster03](https://huggingface.co/datasets/MicPie/unpredictable_cluster03)
* [UnpredicTable-cluster04](https://huggingface.co/datasets/MicPie/unpredictable_cluster04)
* [UnpredicTable-cluster05](https://huggingface.co/datasets/MicPie/unpredictable_cluster05)
* [UnpredicTable-cluster06](https://huggingface.co/datasets/MicPie/unpredictable_cluster06)
* [UnpredicTable-cluster07](https://huggingface.co/datasets/MicPie/unpredictable_cluster07)
* [UnpredicTable-cluster08](https://huggingface.co/datasets/MicPie/unpredictable_cluster08)
* [UnpredicTable-cluster09](https://huggingface.co/datasets/MicPie/unpredictable_cluster09)
* [UnpredicTable-cluster10](https://huggingface.co/datasets/MicPie/unpredictable_cluster10)
* [UnpredicTable-cluster11](https://huggingface.co/datasets/MicPie/unpredictable_cluster11)
* [UnpredicTable-cluster12](https://huggingface.co/datasets/MicPie/unpredictable_cluster12)
* [UnpredicTable-cluster13](https://huggingface.co/datasets/MicPie/unpredictable_cluster13)
* [UnpredicTable-cluster14](https://huggingface.co/datasets/MicPie/unpredictable_cluster14)
* [UnpredicTable-cluster15](https://huggingface.co/datasets/MicPie/unpredictable_cluster15)
* [UnpredicTable-cluster16](https://huggingface.co/datasets/MicPie/unpredictable_cluster16)
* [UnpredicTable-cluster17](https://huggingface.co/datasets/MicPie/unpredictable_cluster17)
* [UnpredicTable-cluster18](https://huggingface.co/datasets/MicPie/unpredictable_cluster18)
* [UnpredicTable-cluster19](https://huggingface.co/datasets/MicPie/unpredictable_cluster19)
* [UnpredicTable-cluster20](https://huggingface.co/datasets/MicPie/unpredictable_cluster20)
* [UnpredicTable-cluster21](https://huggingface.co/datasets/MicPie/unpredictable_cluster21)
* [UnpredicTable-cluster22](https://huggingface.co/datasets/MicPie/unpredictable_cluster22)
* [UnpredicTable-cluster23](https://huggingface.co/datasets/MicPie/unpredictable_cluster23)
* [UnpredicTable-cluster24](https://huggingface.co/datasets/MicPie/unpredictable_cluster24)
* [UnpredicTable-cluster25](https://huggingface.co/datasets/MicPie/unpredictable_cluster25)
* [UnpredicTable-cluster26](https://huggingface.co/datasets/MicPie/unpredictable_cluster26)
* [UnpredicTable-cluster27](https://huggingface.co/datasets/MicPie/unpredictable_cluster27)
* [UnpredicTable-cluster28](https://huggingface.co/datasets/MicPie/unpredictable_cluster28)
* [UnpredicTable-cluster29](https://huggingface.co/datasets/MicPie/unpredictable_cluster29)
* [UnpredicTable-cluster-noise](https://huggingface.co/datasets/MicPie/unpredictable_cluster-noise)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. Our dataset is very wide, i.e., it contains thousands of tasks, each with only a few examples, whereas most current NLP datasets are very deep, i.e., tens of tasks with many examples each. This implies that our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning/pre-training on our dataset.
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a jsonline file and consists of several few-shot examples. Each example is a dictionary containing a field 'task', which identifies the task, followed by an 'input', 'options', and 'output' field. The 'input' field contains several column elements of the same row in the table, while the 'output' field is a target which represents an individual column of the same row. Each task contains several such examples which can be concatenated as a few-shot task. In the case of multiple choice classification, the 'options' field contains the possible classes that a model needs to choose from.
There are also additional meta-data fields such as 'pageTitle', 'title', 'outputColName', 'url', 'wdcFile'.
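As an illustration, a minimal sketch of what a single task's examples might look like and how they could be concatenated into a few-shot prompt is shown below. The field names follow the description above, but all values (and the concatenation format) are hypothetical and only meant to convey the structure:

```python
# Hypothetical examples for one task; field names follow the card,
# values are invented purely for illustration.
examples = [
    {
        "task": "example-task-from-dummies-com",
        "input": "Piece: Rook. Colour: White.",
        "options": ["1", "3", "5", "9"],
        "output": "5",
        "pageTitle": "Chess piece values",
        "outputColName": "Point value",
    },
    {
        "task": "example-task-from-dummies-com",
        "input": "Piece: Bishop. Colour: Black.",
        "options": ["1", "3", "5", "9"],
        "output": "3",
        "pageTitle": "Chess piece values",
        "outputColName": "Point value",
    },
]

# Use all but the last example as in-context demonstrations and keep the
# last output as the target the model should predict.
demonstrations, query = examples[:-1], examples[-1]

prompt = "\n\n".join(
    [f"{ex['input']}\n{ex['output']}" for ex in demonstrations] + [query["input"]]
)

print(prompt)
print("expected answer:", query["output"])
```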
### Data Fields
- `task`: task identifier
- `input`: column elements of a specific row in the table
- `options`: for multiple-choice classification, the options to choose from
- `output`: target column element of the same row as the input
- `pageTitle`: the title of the page containing the table
- `outputColName`: output column name
- `url`: URL of the website containing the table
- `wdcFile`: WDC Web Table Corpus file
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Annotations
#### Annotation process
Manual annotation was only carried out for the [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low),
[UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium), and [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high) data subsets to rate task quality. Detailed annotation instructions can be found in our publication.
#### Who are the annotators?
Annotations were carried out by a lab assistant.
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables that are scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis on the biases prevalent in our datasets. Neither have we explicitly filtered the content. This implies that a model trained on our dataset may potentially reflect harmful biases and toxic text that exist in our dataset.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Dataset Curators
Jun Shern Chan, Michael Pieler, Jonathan Jao, Jérémy Scheurer, Ethan Perez
### Licensing Information
Apache 2.0
### Citation Information
```
@misc{chan2022few,
author = {Chan, Jun Shern and Pieler, Michael and Jao, Jonathan and Scheurer, Jérémy and Perez, Ethan},
title = {Few-shot Adaptation Works with UnpredicTable Data},
publisher={arXiv},
year = {2022},
url = {https://arxiv.org/abs/2208.01009}
}
```
|
Atipico1/mrqa-test-final-set-v2-new_question-demon-new_question | ---
dataset_info:
features:
- name: subset
dtype: string
- name: qid
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: masked_query
dtype: string
- name: context
dtype: string
- name: answer_sent
dtype: string
- name: answer_in_context
sequence: string
- name: entity
dtype: string
- name: similar_entity
dtype: string
- name: clear_answer_sent
dtype: string
- name: vague_answer_sent
dtype: string
- name: adversary
dtype: string
- name: replace_count
dtype: int64
- name: adversarial_passage
dtype: string
- name: masked_answer_sent
dtype: string
- name: num_mask_token
dtype: int64
- name: entities
sequence: string
- name: gpt_adv_sent
dtype: string
- name: is_same
dtype: string
- name: gpt_adv_sent_passage
dtype: string
- name: gpt_passage
dtype: string
- name: gpt_adv_sent_passage_demon
dtype: string
- name: new_question
dtype: string
splits:
- name: train
num_bytes: 2755943
num_examples: 684
download_size: 1762217
dataset_size: 2755943
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Amala/bil | ---
license: unknown
---
|
atmallen/qm_alice_hard_4_grader_first_1.0e | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
splits:
- name: train
num_bytes: 3455633.0
num_examples: 37091
- name: validation
num_bytes: 369717.0
num_examples: 3969
- name: test
num_bytes: 365744.0
num_examples: 3926
download_size: 1063722
dataset_size: 4191094.0
---
# Dataset Card for "qm_alice_hard_4_grader_first_1.0e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mitsudate/Multi-langsV4_Opencpop_Ver1 | ---
license: unknown
---
|
nguyentruong-ins/codeforces_cpp_cleaned_scaled_class | ---
dataset_info:
features:
- name: solution
dtype: string
- name: difficulty
dtype: int64
splits:
- name: train
num_bytes: 1402541089.9400597
num_examples: 1076270
- name: test
num_bytes: 175317962.02997017
num_examples: 134534
- name: valid
num_bytes: 175317962.02997017
num_examples: 134534
download_size: 736309198
dataset_size: 1753177014.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
open-llm-leaderboard/details_nbeerbower__bruphin-iota | ---
pretty_name: Evaluation run of nbeerbower/bruphin-iota
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nbeerbower/bruphin-iota](https://huggingface.co/nbeerbower/bruphin-iota) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__bruphin-iota\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T20:07:15.035652](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__bruphin-iota/blob/main/results_2024-03-29T20-07-15.035652.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6542322911504688,\n\
\ \"acc_stderr\": 0.03203408210032273,\n \"acc_norm\": 0.6544574095861694,\n\
\ \"acc_norm_stderr\": 0.032690727546977244,\n \"mc1\": 0.4920440636474908,\n\
\ \"mc1_stderr\": 0.01750128507455183,\n \"mc2\": 0.6616821624309618,\n\
\ \"mc2_stderr\": 0.015204824370197082\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6604095563139932,\n \"acc_stderr\": 0.013839039762820167,\n\
\ \"acc_norm\": 0.6843003412969283,\n \"acc_norm_stderr\": 0.013582571095815291\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.696176060545708,\n\
\ \"acc_stderr\": 0.004589676274079078,\n \"acc_norm\": 0.8654650468034256,\n\
\ \"acc_norm_stderr\": 0.003405288007233208\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894444,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894444\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948475,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948475\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371807,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371807\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.02370309952525817,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.02370309952525817\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n\
\ \"acc_stderr\": 0.01651367603117959,\n \"acc_norm\": 0.4212290502793296,\n\
\ \"acc_norm_stderr\": 0.01651367603117959\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n\
\ \"acc_stderr\": 0.01274724896707906,\n \"acc_norm\": 0.470013037809648,\n\
\ \"acc_norm_stderr\": 0.01274724896707906\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487043,\n \
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487043\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4920440636474908,\n\
\ \"mc1_stderr\": 0.01750128507455183,\n \"mc2\": 0.6616821624309618,\n\
\ \"mc2_stderr\": 0.015204824370197082\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8105761641673244,\n \"acc_stderr\": 0.011012790432989245\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6777862016679302,\n \
\ \"acc_stderr\": 0.012872435481188778\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/bruphin-iota
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|arc:challenge|25_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|gsm8k|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hellaswag|10_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-07-15.035652.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T20-07-15.035652.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- '**/details_harness|winogrande|5_2024-03-29T20-07-15.035652.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T20-07-15.035652.parquet'
- config_name: results
data_files:
- split: 2024_03_29T20_07_15.035652
path:
- results_2024-03-29T20-07-15.035652.parquet
- split: latest
path:
- results_2024-03-29T20-07-15.035652.parquet
---
# Dataset Card for Evaluation run of nbeerbower/bruphin-iota
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nbeerbower/bruphin-iota](https://huggingface.co/nbeerbower/bruphin-iota) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__bruphin-iota",
"harness_winogrande_5",
split="train")
```
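The aggregated "results" configuration mentioned above can be loaded the same way. A minimal sketch, assuming only the config and split names declared in the YAML header of this card:
```python
from datasets import load_dataset

# Sketch: load the aggregated "results" configuration; the "latest" split
# points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_nbeerbower__bruphin-iota",
    "results",
    split="latest",
)
print(results[0])
```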
## Latest results
These are the [latest results from run 2024-03-29T20:07:15.035652](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__bruphin-iota/blob/main/results_2024-03-29T20-07-15.035652.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6542322911504688,
"acc_stderr": 0.03203408210032273,
"acc_norm": 0.6544574095861694,
"acc_norm_stderr": 0.032690727546977244,
"mc1": 0.4920440636474908,
"mc1_stderr": 0.01750128507455183,
"mc2": 0.6616821624309618,
"mc2_stderr": 0.015204824370197082
},
"harness|arc:challenge|25": {
"acc": 0.6604095563139932,
"acc_stderr": 0.013839039762820167,
"acc_norm": 0.6843003412969283,
"acc_norm_stderr": 0.013582571095815291
},
"harness|hellaswag|10": {
"acc": 0.696176060545708,
"acc_stderr": 0.004589676274079078,
"acc_norm": 0.8654650468034256,
"acc_norm_stderr": 0.003405288007233208
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948475,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948475
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371807,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371807
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.02370309952525817,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.02370309952525817
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.01651367603117959,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.01651367603117959
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712992,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712992
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.01274724896707906,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.01274724896707906
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487043,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487043
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4920440636474908,
"mc1_stderr": 0.01750128507455183,
"mc2": 0.6616821624309618,
"mc2_stderr": 0.015204824370197082
},
"harness|winogrande|5": {
"acc": 0.8105761641673244,
"acc_stderr": 0.011012790432989245
},
"harness|gsm8k|5": {
"acc": 0.6777862016679302,
"acc_stderr": 0.012872435481188778
}
}
```
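The raw results file linked above can also be fetched directly from the repository. This is only a sketch; the exact JSON layout is assumed to match the excerpt shown above and may differ slightly.
```python
import json

from huggingface_hub import hf_hub_download

# Sketch: download the results JSON referenced above and print the aggregated block.
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_nbeerbower__bruphin-iota",
    repo_type="dataset",
    filename="results_2024-03-29T20-07-15.035652.json",
)
with open(results_path) as f:
    results = json.load(f)

# Assumption: aggregated metrics live under a "results" key (fall back to the root).
metrics = results.get("results", results)
print(metrics["all"])
```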
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
griffin/seahorse_fewshot | ---
dataset_info:
features:
- name: gem_id
dtype: string
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 157355986
num_examples: 84795
- name: validation
num_bytes: 23274328
num_examples: 12513
- name: test
num_bytes: 52121809
num_examples: 25002
download_size: 28900357
dataset_size: 232752123
---
# Dataset Card for "seahorse_fewshot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nirbhaysinghnarang/HinglishCognitiveReframing | ---
license: mit
task_categories:
- question-answering
language:
- hi
- en
pretty_name: Hinglish Cognitive Reframing
--- |
zolak/twitter_dataset_79_1713111575 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 226141
num_examples: 547
download_size: 118214
dataset_size: 226141
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-f87a1758-7384796 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- banking77
eval_info:
task: multi_class_classification
model: mrm8488/distilroberta-finetuned-banking77
dataset_name: banking77
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: mrm8488/distilroberta-finetuned-banking77
* Dataset: banking77
To run new evaluation jobs, visit Hugging Face's [automatic evaluation service](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
lexaizero/sloazqakfqy | ---
license: unknown
---
|
CyberHarem/ogasawara_haruka_soundeuphonium | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Ogasawara Haruka/小笠原晴香 (Sound! Euphonium)
This is the dataset of Ogasawara Haruka/小笠原晴香 (Sound! Euphonium), containing 360 images and their tags.
The core tags of this character are `brown_hair, twintails, low_twintails, brown_eyes, black_hair, yellow_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 360 | 190.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ogasawara_haruka_soundeuphonium/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 360 | 190.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ogasawara_haruka_soundeuphonium/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 605 | 311.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ogasawara_haruka_soundeuphonium/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ogasawara_haruka_soundeuphonium',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
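If you only need one of the pre-packaged IMG+TXT archives from the "List of Packages" table above instead of the raw package, a similar snippet works. This is a minimal sketch; the filename comes from the table, and the local directory name is an arbitrary choice.
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Sketch: download the 1200px IMG+TXT package listed in the table above.
zip_file = hf_hub_download(
    repo_id='CyberHarem/ogasawara_haruka_soundeuphonium',
    repo_type='dataset',
    filename='dataset-1200.zip',
)

# Extract it; each image is paired with a .txt file containing its tags.
dataset_dir = 'dataset_1200'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```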
## List of Clusters
List of tag clustering results; some recurring outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, blush, kitauji_high_school_uniform, serafuku, solo, white_sailor_collar, brown_shirt, closed_mouth, green_neckerchief, upper_body, smile, chalkboard, looking_at_viewer, long_sleeves |
| 1 | 5 |  |  |  |  |  | 1girl, blush, brown_shirt, brown_skirt, chalkboard, green_neckerchief, kitauji_high_school_uniform, long_sleeves, serafuku, solo, white_sailor_collar, closed_mouth, hands_up, looking_at_viewer, open_mouth, pleated_skirt, smile |
| 2 | 7 |  |  |  |  |  | 1girl, brown_shirt, brown_skirt, green_neckerchief, holding_instrument, kitauji_high_school_uniform, long_sleeves, pleated_skirt, serafuku, solo, standing, white_sailor_collar, smile, blush, indoors, open_mouth, white_socks, looking_at_viewer |
| 3 | 7 |  |  |  |  |  | 1girl, brown_serafuku, brown_shirt, brown_skirt, green_neckerchief, kitauji_high_school_uniform, long_sleeves, pleated_skirt, solo, white_sailor_collar, looking_at_viewer, parted_bangs, closed_mouth, smile, blush, indoors, outdoors, window |
| 4 | 13 |  |  |  |  |  | 1girl, blue_sailor_collar, blush, green_neckerchief, kitauji_high_school_uniform, serafuku, solo, upper_body, indoors, white_shirt, anime_coloring, closed_mouth, short_sleeves, looking_at_viewer |
| 5 | 27 |  |  |  |  |  | 1girl, blue_skirt, kitauji_high_school_uniform, pleated_skirt, serafuku, short_sleeves, blue_sailor_collar, green_neckerchief, white_shirt, blush, solo, indoors, standing, closed_mouth, classroom, chalkboard, wristwatch, looking_at_viewer, ponytail |
| 6 | 6 |  |  |  |  |  | 2girls, blue_sailor_collar, blush, green_neckerchief, kitauji_high_school_uniform, serafuku, short_sleeves, white_shirt, blue_skirt, closed_mouth, solo_focus, blurry, indoors, long_hair, chalkboard, classroom |
| 7 | 7 |  |  |  |  |  | 1girl, blush, kitauji_high_school_uniform, serafuku, solo, anime_coloring, brown_shirt, indoors, looking_at_viewer, open_mouth, white_sailor_collar, parted_bangs, portrait, tears |
| 8 | 6 |  |  |  |  |  | 1girl, blush, serafuku, solo, closed_mouth, kitauji_high_school_uniform, sailor_collar |
| 9 | 5 |  |  |  |  |  | 3girls, green_hair, green_neckerchief, kitauji_high_school_uniform, kneehighs, long_sleeves, standing, white_sailor_collar, white_socks, brown_shirt, brown_skirt, indoors, long_hair, pleated_skirt, short_hair, brown_serafuku, closed_mouth, looking_at_another, smile, chalkboard, classroom, hallway, solo_focus |
| 10 | 5 |  |  |  |  |  | 1girl, curtains, indoors, solo, window, blush, collarbone, open_mouth, rain, pink_shirt |
| 11 | 6 |  |  |  |  |  | 1girl, blush, closed_mouth, collarbone, solo, indoors, parted_bangs, pink_shirt, curtains, jacket, short_sleeves, window, hoodie, looking_at_viewer, looking_down, upper_body |
| 12 | 6 |  |  |  |  |  | blurry, blush, multiple_girls, 1girl, playing_instrument, solo_focus, aqua_shirt, sweatdrop |
| 13 | 9 |  |  |  |  |  | blurry, head_out_of_frame, 1boy, closed_mouth, solo, 1girl, male_focus, black_background, white_shirt, kitauji_high_school_uniform, short_hair, upper_body |
| 14 | 8 |  |  |  |  |  | 1girl, indoors, solo, ponytail, short_sleeves, white_socks, hood_down, hoodie, sitting, from_side, open_mouth, blue_shorts, cup, food, plant, profile, bed, full_body, holding |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | kitauji_high_school_uniform | serafuku | solo | white_sailor_collar | brown_shirt | closed_mouth | green_neckerchief | upper_body | smile | chalkboard | looking_at_viewer | long_sleeves | brown_skirt | hands_up | open_mouth | pleated_skirt | holding_instrument | standing | indoors | white_socks | brown_serafuku | parted_bangs | outdoors | window | blue_sailor_collar | white_shirt | anime_coloring | short_sleeves | blue_skirt | classroom | wristwatch | ponytail | 2girls | solo_focus | blurry | long_hair | portrait | tears | sailor_collar | 3girls | green_hair | kneehighs | short_hair | looking_at_another | hallway | curtains | collarbone | rain | pink_shirt | jacket | hoodie | looking_down | multiple_girls | playing_instrument | aqua_shirt | sweatdrop | head_out_of_frame | 1boy | male_focus | black_background | hood_down | sitting | from_side | blue_shorts | cup | food | plant | profile | bed | full_body | holding |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:------------------------------|:-----------|:-------|:----------------------|:--------------|:---------------|:--------------------|:-------------|:--------|:-------------|:--------------------|:---------------|:--------------|:-----------|:-------------|:----------------|:---------------------|:-----------|:----------|:--------------|:-----------------|:---------------|:-----------|:---------|:---------------------|:--------------|:-----------------|:----------------|:-------------|:------------|:-------------|:-----------|:---------|:-------------|:---------|:------------|:-----------|:--------|:----------------|:---------|:-------------|:------------|:-------------|:---------------------|:----------|:-----------|:-------------|:-------|:-------------|:---------|:---------|:---------------|:-----------------|:---------------------|:-------------|:------------|:--------------------|:-------|:-------------|:-------------------|:------------|:----------|:------------|:--------------|:------|:-------|:--------|:----------|:------|:------------|:----------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | X | | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | | X | X | X | X | X | | X | | X | X | X | | | X | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 13 |  |  |  |  |  | X | X | X | X | X | | | X | X | X | | | X | | | | | | | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 27 |  |  |  |  |  | X | X | X | X | X | | | X | X | | | X | X | | | | | X | | X | X | | | | | | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | | X | X | X | | | | X | X | | | X | | | | | | | | | X | | | | | | X | X | | X | X | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | X | | | | X | | | | X | | | X | | | | | X | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | X | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | | | X | | | X | X | X | X | | X | X | | X | X | | | X | | X | X | X | X | | | | | | | | | X | | | | X | | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 5 |  |  |  |  |  | X | X | | | X | | | | | | | | | | | | X | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 6 |  |  |  |  |  | X | X | | | X | | | X | | X | | | X | | | | | | | | X | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 12 | 6 |  |  |  |  |  | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | |
| 13 | 9 |  |  |  |  |  | X | | X | | X | | | X | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | |
| 14 | 8 |  |  |  |  |  | X | | | | X | | | | | | | | | | | | X | | | | X | X | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
xiaozhou0822/dfdsfsdf | ---
license: mit
---
|
Seanxh/twitter_dataset_1713219964 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 207222
num_examples: 484
download_size: 70275
dataset_size: 207222
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/reisen_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of reisen/レイセン (Touhou)
This is the dataset of reisen/レイセン (Touhou), containing 227 images and their tags.
The core tags of this character are `animal_ears, rabbit_ears, short_hair, red_eyes, blue_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 227 | 183.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reisen_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 227 | 125.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reisen_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 463 | 249.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reisen_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 227 | 167.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reisen_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 463 | 321.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reisen_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/reisen_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, long_sleeves, red_necktie, solo, collared_shirt, white_shirt, black_jacket, rifle, pleated_skirt, looking_at_viewer, standing, bangs, pink_skirt, blazer, holding_gun, crescent_pin, open_mouth, smile, blush, hair_between_eyes, buttons, one-hour_drawing_challenge, simple_background |
| 1 | 10 |  |  |  |  |  | 1girl, collared_shirt, long_sleeves, red_necktie, solo, white_shirt, blazer, pleated_skirt, looking_at_viewer, white_background, simple_background, cowboy_shot, open_mouth, rabbit_girl, rabbit_tail, black_jacket, crescent_pin, pink_skirt, bangs, closed_mouth, floppy_ears |
| 2 | 8 |  |  |  |  |  | 1girl, blazer, necktie, purple_hair, skirt, solo, rabbit_tail, open_mouth |
| 3 | 18 |  |  |  |  |  | 1girl, solo, blazer, necktie, skirt, black_thighhighs, smile, zettai_ryouiki, open_mouth |
| 4 | 7 |  |  |  |  |  | 1girl, solo, bat_wings, dress, looking_at_viewer, short_sleeves, smile, wrist_cuffs, mob_cap, multiple_girls, open_mouth, puffy_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | red_necktie | solo | collared_shirt | white_shirt | black_jacket | rifle | pleated_skirt | looking_at_viewer | standing | bangs | pink_skirt | blazer | holding_gun | crescent_pin | open_mouth | smile | blush | hair_between_eyes | buttons | one-hour_drawing_challenge | simple_background | white_background | cowboy_shot | rabbit_girl | rabbit_tail | closed_mouth | floppy_ears | necktie | purple_hair | skirt | black_thighhighs | zettai_ryouiki | bat_wings | dress | short_sleeves | wrist_cuffs | mob_cap | multiple_girls | puffy_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------|:-------|:-----------------|:--------------|:---------------|:--------|:----------------|:--------------------|:-----------|:--------|:-------------|:---------|:--------------|:---------------|:-------------|:--------|:--------|:--------------------|:----------|:-----------------------------|:--------------------|:-------------------|:--------------|:--------------|:--------------|:---------------|:--------------|:----------|:--------------|:--------|:-------------------|:-----------------|:------------|:--------|:----------------|:--------------|:----------|:-----------------|:----------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | | X | X | X | | X | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | | X | | | | | | | | | | X | | | X | | | | | | | | | | X | | | X | X | X | | | | | | | | | |
| 3 | 18 |  |  |  |  |  | X | | | X | | | | | | | | | | X | | | X | X | | | | | | | | | | | | X | | X | X | X | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | X | | | | | | X | | | | | | | X | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X |
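To mine one of these clusters, the tagged raw dataset can be filtered by the tags listed above. A hedged sketch (not part of the original card), assuming the raw package has already been extracted to `dataset_dir` as in the loading snippet earlier:
```python
from waifuc.source import LocalSource

# Sketch: keep only the images whose tags match one of the clustered outfits,
# e.g. the school-uniform clusters tagged with `blazer`.
source = LocalSource('dataset_dir')
for item in source:
    tags = item.meta.get('tags', {})
    if 'blazer' in tags:
        print(item.meta['filename'], 'belongs to a blazer cluster')
```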
|
LoGic142/vit_training | ---
license: mit
---
|
ebony59/chaiverse_lora_testing_fandom_IO | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: output_text
dtype: string
splits:
- name: train
num_bytes: 149058.0
num_examples: 100
download_size: 96520
dataset_size: 149058.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "chaiverse_lora_testing_IO"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Librico/llmao-data | ---
license: apache-2.0
---
|
heliosprime/twitter_dataset_1713218753 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 31794
num_examples: 87
download_size: 25377
dataset_size: 31794
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713218753"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
napatswift/thvl_text_recognition | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 734936377.786
num_examples: 363922
download_size: 686919103
dataset_size: 734936377.786
---
# Dataset Card for "thvl_text_recognition"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_29_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 19755254
num_examples: 14935
download_size: 10291371
dataset_size: 19755254
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_29_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
automated-research-group/llama2_7b_chat-arc_challenge-results | ---
dataset_info:
config_name: '{''do_sample''=False, ''beams''=1}'
features:
- name: id
dtype: string
- name: prediction
dtype: string
- name: arc_challenge_accuracy
dtype: bool
splits:
- name: train
num_bytes: 77311
num_examples: 299
download_size: 43636
dataset_size: 77311
configs:
- config_name: '{''do_sample''=False, ''beams''=1}'
data_files:
- split: train
path: '{''do_sample''=False, ''beams''=1}/train-*'
---
|
jeffnyman/emotions | ---
pretty_name: Emotions
license: cc-by-sa-4.0
language:
- en
size_categories:
- 10K<n<100K
task_categories:
- text-classification
task_ids:
- multi-class-classification
tags:
- emotion-classification
dataset_info:
- config_name: split
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
"0": sadness
"1": joy
"2": love
"3": anger
"4": fear
"5": surprise
splits:
- name: train
num_bytes: 1741597
num_examples: 16000
- name: validation
num_bytes: 214703
num_examples: 2000
- name: test
num_bytes: 217181
num_examples: 2000
download_size: 740883
dataset_size: 2173481
- config_name: unsplit
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
"0": sadness
"1": joy
"2": love
"3": anger
"4": fear
"5": surprise
splits:
- name: train
num_bytes: 45445685
num_examples: 416809
download_size: 15388281
dataset_size: 45445685
train-eval-index:
- config: default
task: text-classification
task_id: multi_class_classification
splits:
train_split: train
eval_split: test
col_mapping:
text: text
label: target
metrics:
- type: accuracy
name: Accuracy
- type: f1
name: F1 macro
args:
average: macro
- type: f1
name: F1 micro
args:
average: micro
- type: f1
name: F1 weighted
args:
average: weighted
- type: precision
name: Precision macro
args:
average: macro
- type: precision
name: Precision micro
args:
average: micro
- type: precision
name: Precision weighted
args:
average: weighted
- type: recall
name: Recall macro
args:
average: macro
- type: recall
name: Recall micro
args:
average: micro
- type: recall
name: Recall weighted
args:
average: weighted
---
# Dataset Card for "emotions"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Paper:** [CARER: Contextualized Affect Representations for Emotion Recognition](https://aclanthology.org/D18-1404/)
- **Size of downloaded dataset files:** 16.13 MB
- **Size of the generated dataset:** 47.62 MB
- **Total amount of disk used:** 63.75 MB
### Dataset Summary
Emotions is a dataset of English Twitter messages labeled with six basic emotions: anger, fear, joy, love, sadness, and surprise. For more detailed information, please refer to the paper. Note that the paper describes a larger dataset in which eight emotions are considered.
## Dataset Structure
### Data Instances
An example instance looks like this:
```
{
"text": "im feeling quite sad and sorry for myself but ill snap out of it soon",
"label": 0
}
```
### Data Fields
The data fields are:
- `text`: a `string` feature.
- `label`: a classification label, with possible values including `sadness` (0), `joy` (1), `love` (2), `anger` (3), `fear` (4), `surprise` (5).
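A short sketch of loading the dataset and mapping label ids back to names (the config name and feature names are taken from this card):
```python
from datasets import load_dataset

# Sketch: load the pre-split configuration and decode the integer label.
emotions = load_dataset("jeffnyman/emotions", "split")

sample = emotions["train"][0]
label_names = emotions["train"].features["label"].names  # ['sadness', 'joy', ...]
print(sample["text"], "->", label_names[sample["label"]])
```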
### Data Splits
The dataset has two configurations.
- split: with a total of 20,000 examples split into train, validation and test.
- unsplit: with a total of 416,809 examples in a single train split.
| name | train | validation | test |
| ------- | -----: | ---------: | ---: |
| split | 16000 | 2000 | 2000 |
| unsplit | 416809 | n/a | n/a |
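To make the configurations and label mapping above concrete, here is a minimal loading sketch (assuming the standard `datasets` library API; the config names `split` and `unsplit` come from the metadata above):
```python
from datasets import load_dataset

# Minimal sketch: load the pre-split configuration and decode a label id.
emotions = load_dataset("jeffnyman/emotions", "split")
sample = emotions["train"][0]
label_name = emotions["train"].features["label"].int2str(sample["label"])
print(sample["text"], "->", label_name)  # e.g. "... feeling quite sad ..." -> sadness
```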
## Additional Information
### Licensing Information
The dataset should be used for educational and research purposes only. It is licensed under Attribution-ShareAlike 4.0 International (CC BY-SA 4.0).
### Citation Information
If you use this dataset, please cite:
```
@inproceedings{saravia-etal-2018-carer,
title = "{CARER}: Contextualized Affect Representations for Emotion Recognition",
author = "Saravia, Elvis and
Liu, Hsien-Chi Toby and
Huang, Yen-Hao and
Wu, Junlin and
Chen, Yi-Shin",
booktitle = "Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing",
month = oct # "-" # nov,
year = "2018",
address = "Brussels, Belgium",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/D18-1404",
doi = "10.18653/v1/D18-1404",
pages = "3687--3697",
abstract = "Emotions are expressed in nuanced ways, which varies by collective or individual experiences, knowledge, and beliefs. Therefore, to understand emotion, as conveyed through text, a robust mechanism capable of capturing and modeling different linguistic nuances and phenomena is needed. We propose a semi-supervised, graph-based algorithm to produce rich structural descriptors which serve as the building blocks for constructing contextualized affect representations from text. The pattern-based representations are further enriched with word embeddings and evaluated through several emotion recognition tasks. Our experimental results demonstrate that the proposed method outperforms state-of-the-art techniques on emotion recognition tasks.",
}
```
|
gokuls/wiki_book_corpus_complete_raw_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 24500165181
num_examples: 80462898
download_size: 14400389437
dataset_size: 24500165181
---
# Dataset Card for "wiki_book_corpus_complete_raw_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tamnemtf/hcmue-teacher-qa | ---
dataset_info:
features:
- name: concept
dtype: string
- name: description
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 178427
num_examples: 445
download_size: 74274
dataset_size: 178427
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
stoddur/med_chat_16_moved_ds_3x | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 834787800.0
num_examples: 271035
download_size: 11362639
dataset_size: 834787800.0
---
# Dataset Card for "med_chat_16_moved_ds_3x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lmqg/qag_esquad | ---
license: cc-by-sa-4.0
pretty_name: SQuAD for question generation
language: es
multilinguality: monolingual
size_categories: 1k<n<10K
source_datasets: lmqg/qg_esquad
task_categories:
- text-generation
task_ids:
- language-modeling
tags:
- question-generation
---
# Dataset Card for "lmqg/qag_esquad"
## Dataset Description
- **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
- **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)
- **Point of Contact:** [Asahi Ushio](http://asahiushio.com/)
### Dataset Summary
This is a question & answer generation dataset based on ESQuAD (the Spanish SQuAD).
### Supported Tasks and Leaderboards
* `question-answer-generation`: The dataset is intended to be used to train a model for question & answer generation.
Success on this task is typically measured by achieving high BLEU4/METEOR/ROUGE-L/BERTScore/MoverScore values (see our paper for more details).
### Languages
Spanish (es)
## Dataset Structure
An example of 'train' looks as follows.
```
{
"paragraph": ""4 Minutes" fue lanzado como el primer sencillo del álbum y alcanzó el número tres en el Billboard Hot 100. Fue el 37º hit top-ten de Madonna en la lista, empujando a Madonna más allá de Elvis Presley como el artista con más éxitos entre los diez primeros. En el Reino Unido mantuvo su récord de más sencillos número uno para una artista femenina; "4 Minutes" se convierte en su decimotercera. En el 23 Japan Gold Disc Awards, Madonna recibió su quinto trofeo de Artista del Año de la Recording Industry Association of Japan, la mayor cantidad para cualquier artista. Para promover aún más el álbum, Madonna se embarcó en el Sticky & Sweet Tour; Su primera gran empresa con Live Nation. Con una recaudación de $280 millones, se convirtió en la gira más taquillera de un artista en solitario entonces, superando el récord anterior que Madonna estableció con la gira Confessions Tour; Más tarde fue superado por The Wall Live de Roger Waters. Se amplió al año siguiente, añadiendo nuevas fechas europeas, y después de que terminó, la recaudación total fue de $408 millones.",
"questions": [ "¿Cuál es el nombre de la primera gira con Live Nation?", "4 minutos se convirtió en la canción número uno de Madonna en el Reino Unido.", "¿Cuál sencillo fue lanzado como el primer sencillo del álbum?", "¿Cuánto recaudaron Stick y Sweet Tour?", "Madonna superó a qué artista con más éxitos entre los diez primeros." ],
"answers": [ "Sticky & Sweet Tour", "decimotercera", "\"4 Minute", "$280 millones,", "Elvis Presley" ]
"questions_answers": "question: ¿Cuál es el nombre de la primera gira con Live Nation?, answer: Sticky & Sweet Tour | question: 4 minutos se convirtió en la canción número uno de Madonna en el Reino Unido., answer: decimotercera | question: ¿Cuál sencillo fue lanzado como el primer sencillo del álbum?, answer: "4 Minute | question: ¿Cuánto recaudaron Stick y Sweet Tour?, answer: $280 millones, | question: Madonna superó a qué artista con más éxitos entre los diez primeros., answer: Elvis Presley"
}
```
The data fields are the same among all splits.
- `questions`: a `list` of `string` features.
- `answers`: a `list` of `string` features.
- `paragraph`: a `string` feature.
- `questions_answers`: a `string` feature.
## Data Splits
|train|validation|test |
|----:|---------:|----:|
|18829| 2067 | 8234|
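As a minimal usage sketch (assuming the standard `datasets` API), the parallel `questions` and `answers` lists of each paragraph can be paired up directly:
```python
from datasets import load_dataset

# Minimal sketch: pair each generated question with its answer.
data = load_dataset("lmqg/qag_esquad", split="train")
example = data[0]
for question, answer in zip(example["questions"], example["answers"]):
    print(f"Q: {question}\nA: {answer}")
```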
## Citation Information
```
@inproceedings{ushio-etal-2022-generative,
title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
author = "Ushio, Asahi and
Alva-Manchego, Fernando and
Camacho-Collados, Jose",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, U.A.E.",
publisher = "Association for Computational Linguistics",
}
``` |
wanadzhar913/wikipedia-malaysian-road-sign-images | ---
license: apache-2.0
---
# TLDR
* wikipedia page: [Road signs in Malaysia](https://en.wikipedia.org/wiki/Road_signs_in_Malaysia)
* num. of images: 365
* contributed to: https://github.com/orgs/malaysia-ai/projects/9/views/1?pane=issue&itemId=43619647
* date scraped: 14th January 2024 |
Chunt0/anoel-11-26 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 19715096.0
num_examples: 78
download_size: 19715409
dataset_size: 19715096.0
---
# Dataset Card for "anoel-11-26"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
florentgbelidji/pubmed-running | ---
license: openrail
dataset_info:
features:
- name: article_id
dtype: string
- name: article
dtype: string
- name: abstract
dtype: string
- name: section_names
dtype: string
splits:
- name: train
num_bytes: 136252251
num_examples: 5153
download_size: 62923279
dataset_size: 136252251
---
|
open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.1 | ---
pretty_name: Evaluation run of WizardLM/WizardLM-13B-V1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [WizardLM/WizardLM-13B-V1.1](https://huggingface.co/WizardLM/WizardLM-13B-V1.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T23:24:10.120000](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.1/blob/main/results_2023-10-28T23-24-10.120000.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.24213506711409397,\n\
\ \"em_stderr\": 0.004386967355552305,\n \"f1\": 0.3075335570469809,\n\
\ \"f1_stderr\": 0.0043568221171957165,\n \"acc\": 0.41585700582764323,\n\
\ \"acc_stderr\": 0.00984029249742667\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.24213506711409397,\n \"em_stderr\": 0.004386967355552305,\n\
\ \"f1\": 0.3075335570469809,\n \"f1_stderr\": 0.0043568221171957165\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08112206216830932,\n \
\ \"acc_stderr\": 0.007520395797922653\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7505919494869772,\n \"acc_stderr\": 0.012160189196930687\n\
\ }\n}\n```"
repo_url: https://huggingface.co/WizardLM/WizardLM-13B-V1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|arc:challenge|25_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T15_36_34.719946
path:
- '**/details_harness|drop|3_2023-10-18T15-36-34.719946.parquet'
- split: 2023_10_28T23_24_10.120000
path:
- '**/details_harness|drop|3_2023-10-28T23-24-10.120000.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T23-24-10.120000.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T15_36_34.719946
path:
- '**/details_harness|gsm8k|5_2023-10-18T15-36-34.719946.parquet'
- split: 2023_10_28T23_24_10.120000
path:
- '**/details_harness|gsm8k|5_2023-10-28T23-24-10.120000.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T23-24-10.120000.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hellaswag|10_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-26T14:16:17.821348.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-26T14:16:17.821348.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-26T14:16:17.821348.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T15_36_34.719946
path:
- '**/details_harness|winogrande|5_2023-10-18T15-36-34.719946.parquet'
- split: 2023_10_28T23_24_10.120000
path:
- '**/details_harness|winogrande|5_2023-10-28T23-24-10.120000.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T23-24-10.120000.parquet'
- config_name: results
data_files:
- split: 2023_07_26T14_16_17.821348
path:
- results_2023-07-26T14:16:17.821348.parquet
- split: 2023_10_18T15_36_34.719946
path:
- results_2023-10-18T15-36-34.719946.parquet
- split: 2023_10_28T23_24_10.120000
path:
- results_2023-10-28T23-24-10.120000.parquet
- split: latest
path:
- results_2023-10-28T23-24-10.120000.parquet
---
# Dataset Card for Evaluation run of WizardLM/WizardLM-13B-V1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WizardLM/WizardLM-13B-V1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [WizardLM/WizardLM-13B-V1.1](https://huggingface.co/WizardLM/WizardLM-13B-V1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T23:24:10.120000](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.1/blob/main/results_2023-10-28T23-24-10.120000.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.24213506711409397,
"em_stderr": 0.004386967355552305,
"f1": 0.3075335570469809,
"f1_stderr": 0.0043568221171957165,
"acc": 0.41585700582764323,
"acc_stderr": 0.00984029249742667
},
"harness|drop|3": {
"em": 0.24213506711409397,
"em_stderr": 0.004386967355552305,
"f1": 0.3075335570469809,
"f1_stderr": 0.0043568221171957165
},
"harness|gsm8k|5": {
"acc": 0.08112206216830932,
"acc_stderr": 0.007520395797922653
},
"harness|winogrande|5": {
"acc": 0.7505919494869772,
"acc_stderr": 0.012160189196930687
}
}
```
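The per-task configurations above are loaded the same way; as a hedged sketch, the aggregated numbers shown here can also be pulled from the `results` configuration (its `latest` split points to the most recent run):
```python
from datasets import load_dataset

# Sketch only: load the aggregated results of the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.1",
    "results",
    split="latest",
)
print(results[0])
```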
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Yanbin99/GITQA-Aug-Pruned | ---
license: mit
---
|
autoevaluate/autoeval-staging-eval-project-xsum-f0ba0c18-12915726 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: sshleifer/distilbart-xsum-12-6
metrics: ['bleu']
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: sshleifer/distilbart-xsum-12-6
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
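As a rough illustrative sketch (not the AutoTrain evaluation pipeline itself), a single prediction with the evaluated checkpoint can be reproduced using the `document` column named in the mapping above:
```python
from datasets import load_dataset
from transformers import pipeline

# Sketch: summarize one xsum test document with the evaluated model.
summarizer = pipeline("summarization", model="sshleifer/distilbart-xsum-12-6")
sample = load_dataset("xsum", split="test[:1]")[0]
print(summarizer(sample["document"], truncation=True)[0]["summary_text"])
```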
## Contributions
Thanks to [@xarymast](https://huggingface.co/xarymast) for evaluating this model. |