datasetId | card |
|---|---|
wenhanhan/HEALTHVER_test | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1108776
num_examples: 1823
download_size: 339472
dataset_size: 1108776
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "HEALTHVER_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kjappelbaum/pervoskite_db | ---
license: cc-by-4.0
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_60 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1309378480.0
num_examples: 255140
download_size: 1335141507
dataset_size: 1309378480.0
---
# Dataset Card for "chunk_60"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chezhian/Tweet_summary | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1069414
num_examples: 800
download_size: 567110
dataset_size: 1069414
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
coseal/CodeUltraFeedback | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: preference
dtype: string
- name: models
sequence: string
- name: responses
list:
- name: model
dtype: string
- name: response
dtype: string
- name: annotations
list:
- name: model
dtype: string
- name: rating
dtype: string
- name: rationale
dtype: string
splits:
- name: train
num_bytes: 92525565
num_examples: 10000
download_size: 38154440
dataset_size: 92525565
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- text-generation
tags:
- AI feedback
- LLM-as-a-Judge
- code generation
- preference dataset
- coding preferences
size_categories:
- 10K<n<100K
--- |
mrm8488/en_es_results_good | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 1473
num_examples: 20
download_size: 2789
dataset_size: 1473
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Pablao0948/Naldo_Benny | ---
license: openrail
---
|
sradc/chunked-shuffled-wikipedia20220301en-bookcorpusopen | ---
language: en
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 26076989556
num_examples: 33536113
download_size: 17380043798
dataset_size: 26076989556
---
# Dataset Card for "wikipedia20220301en-bookcorpusopen-chunked-shuffled"
```
num_examples: 33.5 million
download_size: 17.4 GB
dataset_size: 26.1 GB
```
This dataset combines [wikipedia20220301.en](https://huggingface.co/datasets/wikipedia) and [bookcorpusopen](https://huggingface.co/datasets/bookcorpusopen),
and splits the data into smaller chunks of ~820 chars
(such that each item will be at least ~128 tokens for the average tokenizer).
The order of the items has been pre-shuffled,
so you don't have to use `dataset.shuffle`,
which is slower to iterate over.
The chunking logic only splits on spaces, so chunks are likely to be slightly larger than 820 chars.
The text has been normalized to lowercase, with accents and non-English characters removed.
Items with fewer than 200 chars or more than 1000 chars have been removed.
This dataset is processed for convenience, at the expense of losing some percentage of the tokens to truncation
(assuming the training minibatches are truncated to 128 tokens).
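The preprocessing described above can be sketched roughly as follows. The original processing script is not published in this card, so the details below (NFKD-based accent stripping, the exact space-only chunking loop, and the `target_chars` threshold) are assumptions reconstructed from the description:

```python
import unicodedata

def normalize(text: str) -> str:
    # Lowercase and strip accents / non-ASCII characters.
    text = unicodedata.normalize("NFKD", text).encode("ascii", "ignore").decode("ascii")
    return text.lower()

def chunk_words(text: str, target_chars: int = 820):
    # Accumulate whole space-separated words until the target size is reached,
    # so chunks may end up slightly larger than target_chars.
    chunk, size = [], 0
    for word in text.split(" "):
        chunk.append(word)
        size += len(word) + 1
        if size >= target_chars:
            yield " ".join(chunk)
            chunk, size = [], 0
    if chunk:
        yield " ".join(chunk)

def process(text: str):
    # Drop chunks outside the 200-1000 char range, as the card describes.
    for c in chunk_words(normalize(text)):
        if 200 <= len(c) <= 1000:
            yield c
```

For example, `list(process(article_text))` yields the lowercased, accent-stripped ~820-char chunks for one source document.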
|
LiveEvil/MyClass | ---
license: mit
---
|
proserve/FedML_PubMedQA_instruction_stf_dataset | ---
dataset_info:
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 545857127
num_examples: 272518
download_size: 266576180
dataset_size: 545857127
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/gov2 | ---
pretty_name: '`gov2`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `gov2`
The `gov2` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/gov2#gov2).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=25,205,179
This dataset is used by: [`gov2_trec-tb-2004`](https://huggingface.co/datasets/irds/gov2_trec-tb-2004), [`gov2_trec-tb-2005`](https://huggingface.co/datasets/irds/gov2_trec-tb-2005), [`gov2_trec-tb-2005_efficiency`](https://huggingface.co/datasets/irds/gov2_trec-tb-2005_efficiency), [`gov2_trec-tb-2005_named-page`](https://huggingface.co/datasets/irds/gov2_trec-tb-2005_named-page), [`gov2_trec-tb-2006`](https://huggingface.co/datasets/irds/gov2_trec-tb-2006), [`gov2_trec-tb-2006_efficiency`](https://huggingface.co/datasets/irds/gov2_trec-tb-2006_efficiency), [`gov2_trec-tb-2006_efficiency_10k`](https://huggingface.co/datasets/irds/gov2_trec-tb-2006_efficiency_10k), [`gov2_trec-tb-2006_efficiency_stream1`](https://huggingface.co/datasets/irds/gov2_trec-tb-2006_efficiency_stream1), [`gov2_trec-tb-2006_efficiency_stream2`](https://huggingface.co/datasets/irds/gov2_trec-tb-2006_efficiency_stream2), [`gov2_trec-tb-2006_efficiency_stream3`](https://huggingface.co/datasets/irds/gov2_trec-tb-2006_efficiency_stream3), [`gov2_trec-tb-2006_efficiency_stream4`](https://huggingface.co/datasets/irds/gov2_trec-tb-2006_efficiency_stream4), [`gov2_trec-tb-2006_named-page`](https://huggingface.co/datasets/irds/gov2_trec-tb-2006_named-page)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/gov2', 'docs')
for record in docs:
record # {'doc_id': ..., 'url': ..., 'http_headers': ..., 'body': ..., 'body_content_type': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
k0ntra/tehranen | ---
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
splits:
- name: train
num_bytes: 1105920
num_examples: 360
download_size: 1749452
dataset_size: 1105920
---
# Dataset Card for "tehranen"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-GPTQ | ---
pretty_name: Evaluation run of TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-GPTQ](https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-GPTQ\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T21:00:02.304492](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-GPTQ/blob/main/results_2023-10-28T21-00-02.304492.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.22158137583892618,\n\
\ \"em_stderr\": 0.004253171428083824,\n \"f1\": 0.28616296140939684,\n\
\ \"f1_stderr\": 0.004276937020149761,\n \"acc\": 0.3751559533333772,\n\
\ \"acc_stderr\": 0.007270592555507228\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.22158137583892618,\n \"em_stderr\": 0.004253171428083824,\n\
\ \"f1\": 0.28616296140939684,\n \"f1_stderr\": 0.004276937020149761\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006823351023502654,\n \
\ \"acc_stderr\": 0.0022675371022544783\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759979\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T21_00_02.304492
path:
- '**/details_harness|drop|3_2023-10-28T21-00-02.304492.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T21-00-02.304492.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T21_00_02.304492
path:
- '**/details_harness|gsm8k|5_2023-10-28T21-00-02.304492.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T21-00-02.304492.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-32-08.880546.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-32-08.880546.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-32-08.880546.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T21_00_02.304492
path:
- '**/details_harness|winogrande|5_2023-10-28T21-00-02.304492.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T21-00-02.304492.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_32_08.880546
path:
- results_2023-09-11T17-32-08.880546.parquet
- split: 2023_10_28T21_00_02.304492
path:
- results_2023-10-28T21-00-02.304492.parquet
- split: latest
path:
- results_2023-10-28T21-00-02.304492.parquet
---
# Dataset Card for Evaluation run of TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-GPTQ](https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
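The timestamped split names replace the usual `-` and `:` date separators with underscores (e.g. `2023_10_28T21_00_02.304492`). A minimal sketch of mapping such a split name back to a `datetime` — `parse_split_timestamp` is a hypothetical helper, not part of the leaderboard tooling:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    # Split names such as "2023_10_28T21_00_02.304492" use "_" in place
    # of the usual "-" (date) and ":" (time) separators; undo that
    # substitution, then parse the result as an ISO 8601 timestamp.
    date_part, time_part = split_name.split("T")
    date_part = date_part.replace("_", "-")
    time_part = time_part.replace("_", ":")
    return datetime.fromisoformat(f"{date_part}T{time_part}")

run_time = parse_split_timestamp("2023_10_28T21_00_02.304492")
print(run_time)  # 2023-10-28 21:00:02.304492
```

This makes it easy to sort the timestamped splits of a configuration chronologically when more runs accumulate.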
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-GPTQ",
"harness_winogrande_5",
split="train")
```
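Each run's aggregated file embeds a fixed-width timestamp, so the newest one sorts last lexicographically. A small sketch of picking it out — `latest_results_file` is a hypothetical helper, assuming the `results_<timestamp>.parquet` naming shown in the configs above:

```python
def latest_results_file(filenames):
    # Filenames look like "results_2023-10-28T21-00-02.304492.parquet".
    # The embedded timestamp is fixed-width and ISO-like, so it sorts
    # correctly as a plain string and max() picks the newest run.
    return max(filenames, key=lambda name: name[len("results_"):])

files = [
    "results_2023-09-11T17-32-08.880546.parquet",
    "results_2023-10-28T21-00-02.304492.parquet",
]
print(latest_results_file(files))
# results_2023-10-28T21-00-02.304492.parquet
```

This mirrors how the "latest" split above resolves to the most recent run's parquet file.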
## Latest results
These are the [latest results from run 2023-10-28T21:00:02.304492](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-GPTQ/blob/main/results_2023-10-28T21-00-02.304492.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.22158137583892618,
"em_stderr": 0.004253171428083824,
"f1": 0.28616296140939684,
"f1_stderr": 0.004276937020149761,
"acc": 0.3751559533333772,
"acc_stderr": 0.007270592555507228
},
"harness|drop|3": {
"em": 0.22158137583892618,
"em_stderr": 0.004253171428083824,
"f1": 0.28616296140939684,
"f1_stderr": 0.004276937020149761
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.0022675371022544783
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759979
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
marcus2000/hse_spam_dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 359224.64467005077
num_examples: 531
- name: test
num_bytes: 40590.35532994924
num_examples: 60
download_size: 216639
dataset_size: 399815.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
easygraph-bench/cheminformatics | ---
size_categories:
- n<1K
--- |
Vidyuth/marian-finetuned-kde4-en-to-fr | ---
license: apache-2.0
tags:
- translation
- generated_from_trainer
datasets:
- kde4
metrics:
- bleu
model-index:
- name: test-marian-finetuned-kde4-en-to-fr
results:
- task:
name: Sequence-to-sequence Language Modeling
type: text2text-generation
dataset:
name: kde4
type: kde4
args: en-fr
metrics:
- name: Bleu
type: bleu
value: 52.94161337775576
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# test-marian-finetuned-kde4-en-to-fr
This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the kde4 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8559
- Bleu: 52.9416
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.12.0.dev0
- Pytorch 1.8.1+cu111
- Datasets 1.12.2.dev0
- Tokenizers 0.10.3
|
CyberHarem/guilty_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of guilty/ギルティ/吉尔提/길티 (Nikke: Goddess of Victory)
This is the dataset of guilty/ギルティ/吉尔提/길티 (Nikke: Goddess of Victory), containing 58 images and their tags.
The core tags of this character are `breasts, long_hair, green_hair, multicolored_hair, purple_eyes, hair_between_eyes, black_hair, bangs, huge_breasts, two-tone_hair, streaked_hair, very_long_hair, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 58 | 101.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guilty_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 58 | 50.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guilty_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 153 | 116.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guilty_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 58 | 86.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guilty_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 153 | 176.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guilty_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/guilty_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
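The `IMG+TXT` packages listed above instead ship each image next to a same-named `.txt` tag file. After extracting one of those zips, the pairs can be collected with plain Python — a hedged sketch that assumes a flat directory of image/text pairs (the helper name is ours):

```python
import os

def collect_img_txt_pairs(dataset_dir):
    """Return (image_path, tag_text) pairs from an extracted IMG+TXT package.

    Assumes each image file sits next to a .txt file with the same stem,
    e.g. 0001.png + 0001.txt.
    """
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if os.path.exists(txt_path):
            with open(txt_path, encoding="utf-8") as f:
                pairs.append((os.path.join(dataset_dir, name), f.read().strip()))
    return pairs
```

Each returned tuple holds the image path and its comma-separated tag string.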
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 35 |  |  |  |  |  | 1girl, solo, looking_at_viewer, bodysuit, blush, chain, closed_mouth, simple_background, brown_hair, gloves, straitjacket |
| 1 | 5 |  |  |  |  |  | 1girl, day, looking_at_viewer, outdoors, solo, building, city, standing, street, cloud, cowboy_shot, curvy, parted_lips, pink_eyes, thick_thighs, arms_behind_head, arms_up, ass, blue_sky, cameltoe, car, covered_navel, from_behind, ground_vehicle, looking_back, skin_tight, thigh_gap, white_bodysuit |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | bodysuit | blush | chain | closed_mouth | simple_background | brown_hair | gloves | straitjacket | day | outdoors | building | city | standing | street | cloud | cowboy_shot | curvy | parted_lips | pink_eyes | thick_thighs | arms_behind_head | arms_up | ass | blue_sky | cameltoe | car | covered_navel | from_behind | ground_vehicle | looking_back | skin_tight | thigh_gap | white_bodysuit |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-----------|:--------|:--------|:---------------|:--------------------|:-------------|:---------|:---------------|:------|:-----------|:-----------|:-------|:-----------|:---------|:--------|:--------------|:--------|:--------------|:------------|:---------------|:-------------------|:----------|:------|:-----------|:-----------|:------|:----------------|:--------------|:-----------------|:---------------|:-------------|:------------|:-----------------|
| 0 | 35 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
rinme/VoiceDatasets | ---
license: mit
---
|
CyberHarem/surcouf_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of surcouf/シュルクーフ/絮库夫 (Azur Lane)
This is the dataset of surcouf/シュルクーフ/絮库夫 (Azur Lane), containing 157 images and their tags.
The core tags of this character are `long_hair, breasts, large_breasts, green_eyes, red_hair, bangs, hat, straw_hat, sun_hat, hair_between_eyes, sunglasses, pink_hair, tinted_eyewear, orange-tinted_eyewear, very_long_hair, looking_over_eyewear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 157 | 280.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/surcouf_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 157 | 134.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/surcouf_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 409 | 305.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/surcouf_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 157 | 234.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/surcouf_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 409 | 463.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/surcouf_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/surcouf_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, bare_shoulders, blush, looking_at_viewer, sideboob, white_gloves, elbow_gloves, open_mouth, simple_background, solo, white_background, hair_ornament, ahoge, armpits, leotard, arm_up, one_side_up, :d, skin_fang, thighhighs, upper_body |
| 1 | 11 |  |  |  |  |  | 1girl, cleavage, day, hat_flower, looking_at_viewer, outdoors, solo, white_bikini, blue_sky, cloud, ocean, bare_shoulders, water, beach, blush, smile, navel, sitting, collarbone, wet |
| 2 | 39 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_bikini, hat_flower, blush, cleavage, bare_shoulders, wet, thighs, navel, in_water, on_side, mouth_hold, ass, collarbone, pink-tinted_eyewear |
| 3 | 5 |  |  |  |  |  | 1girl, cleavage, hat_flower, looking_at_viewer, simple_background, solo, white_background, white_bikini, navel, collarbone, adjusting_eyewear, blush, open_mouth, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | blush | looking_at_viewer | sideboob | white_gloves | elbow_gloves | open_mouth | simple_background | solo | white_background | hair_ornament | ahoge | armpits | leotard | arm_up | one_side_up | :d | skin_fang | thighhighs | upper_body | cleavage | day | hat_flower | outdoors | white_bikini | blue_sky | cloud | ocean | water | beach | smile | navel | sitting | collarbone | wet | thighs | in_water | on_side | mouth_hold | ass | pink-tinted_eyewear | adjusting_eyewear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:--------------------|:-----------|:---------------|:---------------|:-------------|:--------------------|:-------|:-------------------|:----------------|:--------|:----------|:----------|:---------|:--------------|:-----|:------------|:-------------|:-------------|:-----------|:------|:-------------|:-----------|:---------------|:-----------|:--------|:--------|:--------|:--------|:--------|:--------|:----------|:-------------|:------|:---------|:-----------|:----------|:-------------|:------|:----------------------|:--------------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 2 | 39 |  |  |  |  |  | X | X | X | X | | | | | | X | | | | | | | | | | | | X | | X | | X | | | | | | | X | | X | X | X | X | X | X | X | X | |
| 3 | 5 |  |  |  |  |  | X | | X | X | | | | X | X | X | X | | | | | | | | | | | X | | X | | X | | | | | | | X | X | X | | | | | | | | X |
|
medmabfc/Arabic_News_Texts_Corpus | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 655293
num_examples: 154
download_size: 309603
dataset_size: 655293
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DSSGxMunich/nrw-bplan-pdfs | ---
license: mit
---
This dataset contains zips of all pdf files which were downloaded from the [NRW Geoportal](https://www.geoportal.nrw/?activetab=portal). The pdf filenames and document ids can be linked back to the [document_text](https://huggingface.co/datasets/DSSGxMunich/document_text) table. |
Nexdata/German_Speech_Data_by_Mobile_Phone | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/German_Speech_Data_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/949?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
German audio data captured by mobile phone, 1,796 hours in total, recorded by 3,442 German native speakers. The recorded text is designed by linguistic experts, covering generic, interactive, on-board, home and other categories. The text has been proofread manually with high accuracy; this data can be used for automatic speech recognition, machine translation, and voiceprint recognition.
For more details, please refer to the link: https://www.nexdata.ai/datasets/949?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
German
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
316usman/thematic4c_rr | ---
dataset_info:
features:
- name: text
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
- name: num_tokens
dtype: int64
splits:
- name: train
num_bytes: 101174683.2760672
num_examples: 158413
download_size: 35845623
dataset_size: 101174683.2760672
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sovitrath/couch_dataset | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: image
dtype: image
- name: mask
dtype: image
splits:
- name: train
num_bytes: 97948166.0
num_examples: 70
- name: test
num_bytes: 18067627.0
num_examples: 13
download_size: 115763941
dataset_size: 116015793.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
sudheesh/reuters_articles | ---
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 13792576
num_examples: 17262
- name: validation
num_bytes: 1870389
num_examples: 2158
- name: test
num_bytes: 1379190
num_examples: 2158
download_size: 10073411
dataset_size: 17042155
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
NickM2002/zzzterrible | ---
license: apache-2.0
---
|
CodeTheory/demo | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for demo
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("CodeTheory/demo")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("CodeTheory/demo")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves; for the moment, just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| instruction | 指令 | text | True | False |
| input | 输入 | text | True | False |
| output | 输出 | text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| question-rating | 对提问进行评分 | rating | False | N/A | [1, 2, 3, 4, 5, 6, 7, 8, 9] |
| context-rating | 对回复进行评分 | rating | True | N/A | [1, 2, 3, 4, 5, 6, 7, 8, 9] |
| preference | 哪个回复最好?按照从好到坏的顺序进行排序 | ranking | True | N/A | ['A', 'B', 'C'] |
| suggestion | 建议 | text | True | N/A | N/A |
| topics | 选择提问的主题 | multi_label_selection | True | N/A | ['数学问题', '逻辑问题', '地理知识', '文学知识', '居家知识', '行业知识'] |
| sentiment | 对话的感情 | label_selection | True | N/A | ['积极', '中立', '消极'] |
The **suggestions** are human- or machine-generated recommendations for each question, intended to assist the annotator during the annotation process. They are always linked to the existing questions and are named by appending "-suggestion" and "-suggestion-metadata" to the question name, containing the value(s) of the suggestion and its metadata, respectively. The possible values are therefore the same as in the table above, but with the column name suffixed with "-suggestion" and the metadata column with "-suggestion-metadata".
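The suffixing rule can be expressed directly — a tiny sketch using a question name from this dataset (the helper is ours, not an Argilla API):

```python
def suggestion_columns(question_name: str) -> tuple:
    """Derive the two suggestion column names Argilla exports for a question."""
    return (question_name + "-suggestion",
            question_name + "-suggestion-metadata")

print(suggestion_columns("preference"))
# → ('preference-suggestion', 'preference-suggestion-metadata')
```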
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines**, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"input": "",
"instruction": "\u4fdd\u6301\u5065\u5eb7\u7684\u4e09\u4e2a\u63d0\u793a\u3002",
"output": "\u4ee5\u4e0b\u662f\u4fdd\u6301\u5065\u5eb7\u7684\u4e09\u4e2a\u63d0\u793a\uff1a\n\n1. \u4fdd\u6301\u8eab\u4f53\u6d3b\u52a8\u3002\u6bcf\u5929\u505a\u9002\u5f53\u7684\u8eab\u4f53\u8fd0\u52a8\uff0c\u5982\u6563\u6b65\u3001\u8dd1\u6b65\u6216\u6e38\u6cf3\uff0c\u80fd\u4fc3\u8fdb\u5fc3\u8840\u7ba1\u5065\u5eb7\uff0c\u589e\u5f3a\u808c\u8089\u529b\u91cf\uff0c\u5e76\u6709\u52a9\u4e8e\u51cf\u5c11\u4f53\u91cd\u3002\n\n2. \u5747\u8861\u996e\u98df\u3002\u6bcf\u5929\u98df\u7528\u65b0\u9c9c\u7684\u852c\u83dc\u3001\u6c34\u679c\u3001\u5168\u8c37\u7269\u548c\u8102\u80aa\u542b\u91cf\u4f4e\u7684\u86cb\u767d\u8d28\u98df\u7269\uff0c\u907f\u514d\u9ad8\u7cd6\u3001\u9ad8\u8102\u80aa\u548c\u52a0\u5de5\u98df\u54c1\uff0c\u4ee5\u4fdd\u6301\u5065\u5eb7\u7684\u996e\u98df\u4e60\u60ef\u3002\n\n3. \u7761\u7720\u5145\u8db3\u3002\u7761\u7720\u5bf9\u4eba\u4f53\u5065\u5eb7\u81f3\u5173\u91cd\u8981\uff0c\u6210\u5e74\u4eba\u6bcf\u5929\u5e94\u4fdd\u8bc1 7-8 \u5c0f\u65f6\u7684\u7761\u7720\u3002\u826f\u597d\u7684\u7761\u7720\u6709\u52a9\u4e8e\u51cf\u8f7b\u538b\u529b\uff0c\u4fc3\u8fdb\u8eab\u4f53\u6062\u590d\uff0c\u5e76\u63d0\u9ad8\u6ce8\u610f\u529b\u548c\u8bb0\u5fc6\u529b\u3002"
},
"metadata": {},
"responses": [
{
"status": "submitted",
"user_id": "c658ddde-2d39-43ce-b478-633a1d19d2c7",
"values": {
"context-rating": {
"value": 5
},
"preference": {
"value": [
{
"rank": 2,
"value": "A"
},
{
"rank": 3,
"value": "B"
},
{
"rank": 1,
"value": "C"
}
]
},
"question-rating": {
"value": 4
},
"sentiment": {
"value": "\u4e2d\u7acb"
},
"suggestion": {
"value": "111"
},
"topics": {
"value": [
"\u903b\u8f91\u95ee\u9898",
"\u5730\u7406\u77e5\u8bc6"
]
}
}
}
],
"suggestions": [],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"context-rating": [
{
"status": "submitted",
"user_id": "c658ddde-2d39-43ce-b478-633a1d19d2c7",
"value": 5
}
],
"context-rating-suggestion": null,
"context-rating-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"external_id": null,
"input": "",
"instruction": "\u4fdd\u6301\u5065\u5eb7\u7684\u4e09\u4e2a\u63d0\u793a\u3002",
"metadata": "{}",
"output": "\u4ee5\u4e0b\u662f\u4fdd\u6301\u5065\u5eb7\u7684\u4e09\u4e2a\u63d0\u793a\uff1a\n\n1. \u4fdd\u6301\u8eab\u4f53\u6d3b\u52a8\u3002\u6bcf\u5929\u505a\u9002\u5f53\u7684\u8eab\u4f53\u8fd0\u52a8\uff0c\u5982\u6563\u6b65\u3001\u8dd1\u6b65\u6216\u6e38\u6cf3\uff0c\u80fd\u4fc3\u8fdb\u5fc3\u8840\u7ba1\u5065\u5eb7\uff0c\u589e\u5f3a\u808c\u8089\u529b\u91cf\uff0c\u5e76\u6709\u52a9\u4e8e\u51cf\u5c11\u4f53\u91cd\u3002\n\n2. \u5747\u8861\u996e\u98df\u3002\u6bcf\u5929\u98df\u7528\u65b0\u9c9c\u7684\u852c\u83dc\u3001\u6c34\u679c\u3001\u5168\u8c37\u7269\u548c\u8102\u80aa\u542b\u91cf\u4f4e\u7684\u86cb\u767d\u8d28\u98df\u7269\uff0c\u907f\u514d\u9ad8\u7cd6\u3001\u9ad8\u8102\u80aa\u548c\u52a0\u5de5\u98df\u54c1\uff0c\u4ee5\u4fdd\u6301\u5065\u5eb7\u7684\u996e\u98df\u4e60\u60ef\u3002\n\n3. \u7761\u7720\u5145\u8db3\u3002\u7761\u7720\u5bf9\u4eba\u4f53\u5065\u5eb7\u81f3\u5173\u91cd\u8981\uff0c\u6210\u5e74\u4eba\u6bcf\u5929\u5e94\u4fdd\u8bc1 7-8 \u5c0f\u65f6\u7684\u7761\u7720\u3002\u826f\u597d\u7684\u7761\u7720\u6709\u52a9\u4e8e\u51cf\u8f7b\u538b\u529b\uff0c\u4fc3\u8fdb\u8eab\u4f53\u6062\u590d\uff0c\u5e76\u63d0\u9ad8\u6ce8\u610f\u529b\u548c\u8bb0\u5fc6\u529b\u3002",
"preference": [
{
"status": "submitted",
"user_id": "c658ddde-2d39-43ce-b478-633a1d19d2c7",
"value": {
"rank": [
2,
3,
1
],
"value": [
"A",
"B",
"C"
]
}
}
],
"preference-suggestion": null,
"preference-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"question-rating": [
{
"status": "submitted",
"user_id": "c658ddde-2d39-43ce-b478-633a1d19d2c7",
"value": 4
}
],
"question-rating-suggestion": null,
"question-rating-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"sentiment": [
{
"status": "submitted",
"user_id": "c658ddde-2d39-43ce-b478-633a1d19d2c7",
"value": "\u4e2d\u7acb"
}
],
"sentiment-suggestion": null,
"sentiment-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"suggestion": [
{
"status": "submitted",
"user_id": "c658ddde-2d39-43ce-b478-633a1d19d2c7",
"value": "111"
}
],
"suggestion-suggestion": null,
"suggestion-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"topics": [
{
"status": "submitted",
"user_id": "c658ddde-2d39-43ce-b478-633a1d19d2c7",
"value": [
"\u903b\u8f91\u95ee\u9898",
"\u5730\u7406\u77e5\u8bc6"
]
}
],
"topics-suggestion": null,
"topics-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
}
}
```
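In the `datasets` export above, a `RankingQuestion` response stores two parallel arrays: `rank[i]` is the rank assigned to `value[i]`. Recovering the ordered preference (best first) is a small sort — a sketch over the record shown above; the helper name is ours:

```python
def ordered_preference(ranking):
    """Sort the parallel rank/value arrays of a RankingQuestion response,
    best (rank 1) first."""
    return [v for _, v in sorted(zip(ranking["rank"], ranking["value"]))]

# The "preference" response from the record above: C ranked 1, A ranked 2, B ranked 3
response = {"rank": [2, 3, 1], "value": ["A", "B", "C"]}
print(ordered_preference(response))  # → ['C', 'A', 'B']
```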
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; for the moment, just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **instruction** is of type `text`.
* **input** is of type `text`.
* **output** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* (optional) **question-rating** is of type `rating` with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9].
* **context-rating** is of type `rating` with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9].
* **preference** is of type `ranking` with the following allowed values ['A', 'B', 'C'].
* **suggestion** is of type `text`.
* **topics** is of type `multi_label_selection` with the following allowed values ['数学问题', '逻辑问题', '地理知识', '文学知识', '居家知识', '行业知识'].
* **sentiment** is of type `label_selection` with the following allowed values ['积极', '中立', '消极'].
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **question-rating-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9].
* (optional) **context-rating-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9].
* (optional) **preference-suggestion** is of type `ranking` with the following allowed values ['A', 'B', 'C'].
* (optional) **suggestion-suggestion** is of type `text`.
* (optional) **topics-suggestion** is of type `multi_label_selection` with the following allowed values ['数学问题', '逻辑问题', '地理知识', '文学知识', '居家知识', '行业知识'].
* (optional) **sentiment-suggestion** is of type `label_selection` with the following allowed values ['积极', '中立', '消极'].
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to give annotators extra context, such as a link to the original source of the record, or details like the author, the date, or the source. The metadata is always optional, and can potentially be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
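The field, question, suggestion, metadata, and external-ID layout described above can be sketched as a plain Python record. This is a minimal illustration only: the key names mirror the card, but every value below is made up, and the flat-dict shape is an assumption rather than Argilla's actual record class.

```python
# A hypothetical record following the field/question layout described above.
# Key names mirror the card; all values are invented for illustration.
record = {
    "fields": {
        "instruction": "Summarize the following passage.",
        "input": "The quick brown fox jumps over the lazy dog.",
        "output": "A fox jumps over a dog.",
    },
    "responses": {
        "context-rating": 7,                       # rating, allowed values 1..9
        "preference": ["A", "B", "C"],             # ranking over the allowed values
        "suggestion": "Mention the dog is lazy.",  # free-text question
        "sentiment": "中立",                        # one of the allowed labels
    },
    "suggestions": {
        "context-rating-suggestion": 8,            # optional, same type as its question
    },
    "metadata": {"source": "https://example.com/doc/1"},  # optional extra context
    "external_id": "rec-0001",                            # optional external reference
}

# Basic sanity checks matching the declared types
assert record["responses"]["context-rating"] in range(1, 10)
assert set(record["responses"]["preference"]) == {"A", "B", "C"}
assert record["responses"]["sentiment"] in ["积极", "中立", "消极"]
```

In the actual Argilla dataset, responses and suggestions are attached per question, so a record with no `suggestions` or `metadata` entries is perfectly valid.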
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/kochou_shinobu_demonslayer | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kochou Shinobu (Demon Slayer)
This is the dataset of Kochou Shinobu (Demon Slayer), containing 92 images and their tags.
The core tags of this character are `black_hair, hair_ornament, multicolored_hair, purple_hair, butterfly_hair_ornament, short_hair, gradient_hair, purple_eyes, no_pupils`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 92 | 91.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kochou_shinobu_demonslayer/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 92 | 66.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kochou_shinobu_demonslayer/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 189 | 127.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kochou_shinobu_demonslayer/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 92 | 91.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kochou_shinobu_demonslayer/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 189 | 163.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kochou_shinobu_demonslayer/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kochou_shinobu_demonslayer',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 34 |  |  |  |  |  | 1girl, demon_slayer_uniform, solo, haori, smile, closed_mouth, empty_eyes, upper_body, anime_coloring, black_jacket, portrait |
| 1 | 6 |  |  |  |  |  | 1girl, blurry_background, demon_slayer_uniform, haori, outdoors, solo, day, katana, upper_body, empty_eyes, black_jacket |
| 2 | 5 |  |  |  |  |  | 1girl, demon_slayer_uniform, haori, holding_sword, katana, long_sleeves, solo, looking_at_viewer, smile, belt, empty_eyes, open_mouth, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | demon_slayer_uniform | solo | haori | smile | closed_mouth | empty_eyes | upper_body | anime_coloring | black_jacket | portrait | blurry_background | outdoors | day | katana | holding_sword | long_sleeves | looking_at_viewer | belt | open_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------------|:-------|:--------|:--------|:---------------|:-------------|:-------------|:-----------------|:---------------|:-----------|:--------------------|:-----------|:------|:---------|:----------------|:---------------|:--------------------|:-------|:-------------|
| 0 | 34 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | | X | X | | X | | X | X | X | X | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | | X | X | | | | | | | X | X | X | X | X | X |
|
yzhuang/autotree_automl_credit_gosdt_l512_d3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 6767200000
num_examples: 100000
- name: validation
num_bytes: 676720000
num_examples: 10000
download_size: 1576062930
dataset_size: 7443920000
---
# Dataset Card for "autotree_automl_credit_gosdt_l512_d3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
maadm-nlp-group-b/maadm-nlp-assignment | ---
license: apache-2.0
---
|
tog/galleon-llama2-27k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4280355.9
num_examples: 24300
- name: test
num_bytes: 475595.1
num_examples: 2700
download_size: 2318132
dataset_size: 4755951.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "galleon-llama2-27k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deepapaikar/Llama_SentencePairs | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2066912
num_examples: 5603
download_size: 894892
dataset_size: 2066912
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yfqiu-nlp/mfact-classification | ---
license: mit
dataset_info:
features:
- name: en
dtype: string
- name: zh
dtype: string
- name: es
dtype: string
- name: fr
dtype: string
- name: vi
dtype: string
- name: hi
dtype: string
- name: tr
dtype: string
splits:
- name: train
num_bytes: 182
num_examples: 3
download_size: 3134
dataset_size: 182
---
|
irds/clinicaltrials_2021_trec-ct-2021 | ---
pretty_name: '`clinicaltrials/2021/trec-ct-2021`'
viewer: false
source_datasets: ['irds/clinicaltrials_2021']
task_categories:
- text-retrieval
---
# Dataset Card for `clinicaltrials/2021/trec-ct-2021`
The `clinicaltrials/2021/trec-ct-2021` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/clinicaltrials#clinicaltrials/2021/trec-ct-2021).
# Data
This dataset provides:
- `queries` (i.e., topics); count=75
- `qrels`: (relevance assessments); count=35,832
- For `docs`, use [`irds/clinicaltrials_2021`](https://huggingface.co/datasets/irds/clinicaltrials_2021)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/clinicaltrials_2021_trec-ct-2021', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/clinicaltrials_2021_trec-ct-2021', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
pkr7098/CharacterTrajectories_eq | ---
license: cc-by-4.0
---
# CharacterTrajectories_eq |
rntc/blurb_bc2gm_a-0-tm | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: type
dtype: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B
'2': I
splits:
- name: train
num_bytes: 34066605
num_examples: 12574
- name: validation
num_bytes: 6752317
num_examples: 2519
- name: test
num_bytes: 13374141
num_examples: 5038
download_size: 9167736
dataset_size: 54193063
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
lhallee/dl_binary_reg | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: seqs
dtype: string
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 2692075
num_examples: 5473
- name: valid
num_bytes: 653234
num_examples: 1335
- name: test
num_bytes: 905979
num_examples: 1729
download_size: 4189564
dataset_size: 4251288
---
# Dataset Card for "dl_binary_reg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ghomasHudson/longdoc_paired_style_change | ---
dataset_info:
features:
- name: input
dtype: string
- name: response_j
dtype: string
- name: response_k
dtype: string
splits:
- name: train
num_bytes: 388661283
num_examples: 109799
- name: validation
num_bytes: 41678547
num_examples: 11799
download_size: 0
dataset_size: 430339830
---
# Dataset Card for "longdoc_paired_style_change"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b | ---
pretty_name: Evaluation run of macadeliccc/laser-polyglot-4x7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [macadeliccc/laser-polyglot-4x7b](https://huggingface.co/macadeliccc/laser-polyglot-4x7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T01:28:04.517036](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b/blob/main/results_2024-01-14T01-28-04.517036.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6383969687290681,\n\
\ \"acc_stderr\": 0.032222378716622334,\n \"acc_norm\": 0.6424348983154926,\n\
\ \"acc_norm_stderr\": 0.03285947296719794,\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5546852358397387,\n\
\ \"mc2_stderr\": 0.015162772354647294\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6092150170648464,\n \"acc_stderr\": 0.01425856388051378,\n\
\ \"acc_norm\": 0.6416382252559727,\n \"acc_norm_stderr\": 0.014012883334859857\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6581358295160327,\n\
\ \"acc_stderr\": 0.0047336492748145075,\n \"acc_norm\": 0.8498307110137423,\n\
\ \"acc_norm_stderr\": 0.0035650718701954478\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268552,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268552\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478926,\n\
\ \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478926\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.01633288239343135,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.01633288239343135\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n\
\ \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.0133064782430663,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.0133064782430663\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265016,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265016\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n\
\ \"acc_stderr\": 0.0160837499868537,\n \"acc_norm\": 0.36312849162011174,\n\
\ \"acc_norm_stderr\": 0.0160837499868537\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n\
\ \"acc_stderr\": 0.012733671880342506,\n \"acc_norm\": 0.4621903520208605,\n\
\ \"acc_norm_stderr\": 0.012733671880342506\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824876,\n\
\ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824876\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786855,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786855\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5546852358397387,\n\
\ \"mc2_stderr\": 0.015162772354647294\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497811\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4844579226686884,\n \
\ \"acc_stderr\": 0.013765829454512891\n }\n}\n```"
repo_url: https://huggingface.co/macadeliccc/laser-polyglot-4x7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|arc:challenge|25_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|gsm8k|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hellaswag|10_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-28-04.517036.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T01-28-04.517036.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- '**/details_harness|winogrande|5_2024-01-14T01-28-04.517036.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T01-28-04.517036.parquet'
- config_name: results
data_files:
- split: 2024_01_14T01_28_04.517036
path:
- results_2024-01-14T01-28-04.517036.parquet
- split: latest
path:
- results_2024-01-14T01-28-04.517036.parquet
---
# Dataset Card for Evaluation run of macadeliccc/laser-polyglot-4x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/laser-polyglot-4x7b](https://huggingface.co/macadeliccc/laser-polyglot-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b",
"harness_winogrande_5",
split="train")
```
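Once loaded, each task configuration contains per-sample evaluation details, while the "results" configuration holds the aggregated metrics whose structure is shown in the JSON below. As a minimal sketch (using metric values copied from this card, with no network access assumed), the headline figures can be pulled out of that dictionary like this:

```python
# Minimal sketch: extracting headline metrics from the aggregated results
# dictionary. Structure and values are copied from this card; a real run
# would obtain this dict by loading the "results" configuration.
results = {
    "all": {
        "acc": 0.6383969687290681,
        "acc_norm": 0.6424348983154926,
    },
    "harness|arc:challenge|25": {
        "acc": 0.6092150170648464,
        "acc_norm": 0.6416382252559727,
    },
}

# Overall (macro-averaged) accuracy across all tasks.
overall_acc = results["all"]["acc"]

# Per-task normalized accuracies, excluding the "all" aggregate entry.
per_task_acc_norm = {
    task: scores["acc_norm"]
    for task, scores in results.items()
    if task != "all"
}

print(f"overall acc: {overall_acc:.4f}")  # overall acc: 0.6384
print(per_task_acc_norm)
```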
## Latest results
These are the [latest results from run 2024-01-14T01:28:04.517036](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b/blob/main/results_2024-01-14T01-28-04.517036.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6383969687290681,
"acc_stderr": 0.032222378716622334,
"acc_norm": 0.6424348983154926,
"acc_norm_stderr": 0.03285947296719794,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5546852358397387,
"mc2_stderr": 0.015162772354647294
},
"harness|arc:challenge|25": {
"acc": 0.6092150170648464,
"acc_stderr": 0.01425856388051378,
"acc_norm": 0.6416382252559727,
"acc_norm_stderr": 0.014012883334859857
},
"harness|hellaswag|10": {
"acc": 0.6581358295160327,
"acc_stderr": 0.0047336492748145075,
"acc_norm": 0.8498307110137423,
"acc_norm_stderr": 0.0035650718701954478
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268552,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478926,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478926
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.01633288239343135,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.01633288239343135
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.0133064782430663,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.0133064782430663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265016,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265016
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36312849162011174,
"acc_stderr": 0.0160837499868537,
"acc_norm": 0.36312849162011174,
"acc_norm_stderr": 0.0160837499868537
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4621903520208605,
"acc_stderr": 0.012733671880342506,
"acc_norm": 0.4621903520208605,
"acc_norm_stderr": 0.012733671880342506
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824876,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824876
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786855,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786855
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5546852358397387,
"mc2_stderr": 0.015162772354647294
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497811
},
"harness|gsm8k|5": {
"acc": 0.4844579226686884,
"acc_stderr": 0.013765829454512891
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ghoskno/laion-art-en-colorcanny | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 507481937115.0
num_examples: 2639345
download_size: 48871327240
dataset_size: 507481937115.0
---
# Dataset Card for "laion-art-en-colorcanny"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kelvinyasu/autotrain-data-xcdn2 | ---
language:
- en
---
# AutoTrain Dataset for project: xcdn2
## Dataset Description
This dataset has been automatically processed by AutoTrain for project xcdn2.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"context": "To be eligible for the PMP certification, you need to meet certain educational and professional experience requirements. The prerequisites depend on your level of education.",
"question": "What are the prerequisites for PMP certification?",
"answers.text": [
"To be eligible for the PMP certification, you need to meet certain educational and professional experience requirements. The prerequisites depend on your level of education."
],
"answers.answer_start": [
0
]
},
{
"context": "Project management is the practice of initiating, planning, executing, controlling, and closing the work of a team to achieve specific goals and meet specific success criteria.",
"question": "What is project management?",
"answers.text": [
"Project management is the practice of initiating, planning, executing, controlling, and closing the work of a team to achieve specific goals and meet specific success criteria."
],
"answers.answer_start": [
0
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"context": "Value(dtype='string', id=None)",
"question": "Value(dtype='string', id=None)",
"answers.text": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"answers.answer_start": "Sequence(feature=Value(dtype='int32', id=None), length=-1, id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 8 |
| valid | 2 |
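The records shown above follow the SQuAD answer-offset convention, so their alignment can be sanity-checked with plain Python. A minimal sketch (the `validate_record` helper is illustrative, not part of AutoTrain):

```python
def validate_record(record: dict) -> bool:
    """Check that each answer span occurs in the context at its
    declared start offset (SQuAD-style alignment)."""
    context = record["context"]
    texts = record["answers.text"]
    starts = record["answers.answer_start"]
    if len(texts) != len(starts):
        return False
    return all(
        context[start:start + len(text)] == text
        for text, start in zip(texts, starts)
    )

sample = {
    "context": "Project management is the practice of initiating, planning, "
               "executing, controlling, and closing the work of a team.",
    "question": "What is project management?",
    "answers.text": [
        "Project management is the practice of initiating, planning, "
        "executing, controlling, and closing the work of a team."
    ],
    "answers.answer_start": [0],
}
print(validate_record(sample))  # → True
```

A record passes only if every answer string occurs in its context at the declared start offset.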
|
CyberHarem/belka_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of belka (Fire Emblem)
This is the dataset of belka (Fire Emblem), containing 106 images and their tags.
The core tags of this character are `blue_hair, short_hair, headband, breasts, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 106 | 110.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/belka_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 106 | 63.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/belka_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 231 | 124.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/belka_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 106 | 98.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/belka_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 231 | 173.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/belka_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
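For the IMG+TXT packages, each image is assumed to ship with a same-named `.txt` sidecar holding its comma-separated tags; the flat-directory layout below is an assumption based on the package description, not documented by the crawler. A minimal stdlib sketch for pairing images with their tags after extracting one of the zips:

```python
from pathlib import Path

def load_img_txt_pairs(dataset_dir: str) -> dict:
    """Map each image file name to the tag string from its .txt sidecar.

    Assumes a flat directory where e.g. 1.png sits next to 1.txt;
    images without a sidecar are skipped.
    """
    pairs = {}
    root = Path(dataset_dir)
    for image in sorted(root.glob("*.png")) + sorted(root.glob("*.jpg")):
        sidecar = image.with_suffix(".txt")
        if sidecar.exists():
            pairs[image.name] = sidecar.read_text(encoding="utf-8").strip()
    return pairs
```

The resulting mapping can then feed a captioned-image training pipeline directly.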
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/belka_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, solo, armor, cape, looking_at_viewer, simple_background, upper_body, closed_mouth, holding_weapon, torn_clothes, white_background |
| 1 | 9 |  |  |  |  |  | 1girl, solo, armor, scarf, upper_body, gauntlets, weapon, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | armor | cape | looking_at_viewer | simple_background | upper_body | closed_mouth | holding_weapon | torn_clothes | white_background | scarf | gauntlets | weapon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-------|:--------------------|:--------------------|:-------------|:---------------|:-----------------|:---------------|:-------------------|:--------|:------------|:---------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | | | | X | | | | X | X | X | X |
|
autoevaluate/autoeval-eval-billsum-default-e7f679-2243071585 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- billsum
eval_info:
task: summarization
model: Artifact-AI/led_large_16384_billsum_summarization
metrics: []
dataset_name: billsum
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: Artifact-AI/led_large_16384_billsum_summarization
* Dataset: billsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Artifact-AI](https://huggingface.co/Artifact-AI) for evaluating this model. |
ibranze/araproje_hellaswag_tr_conf_mgpt_nearestscore_true_y | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 87120
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_conf_mgpt_nearestscore_true_y"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_harborwater__open-llama-3b-everything-v2 | ---
pretty_name: Evaluation run of harborwater/open-llama-3b-everything-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [harborwater/open-llama-3b-everything-v2](https://huggingface.co/harborwater/open-llama-3b-everything-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_harborwater__open-llama-3b-everything-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T00:43:57.732775](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__open-llama-3b-everything-v2/blob/main/results_2023-10-29T00-43-57.732775.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0020973154362416107,\n\
\ \"em_stderr\": 0.0004685065030368325,\n \"f1\": 0.0560864093959733,\n\
\ \"f1_stderr\": 0.0013597729822813858,\n \"acc\": 0.341030820866541,\n\
\ \"acc_stderr\": 0.008350924483766176\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.0004685065030368325,\n\
\ \"f1\": 0.0560864093959733,\n \"f1_stderr\": 0.0013597729822813858\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \
\ \"acc_stderr\": 0.0034478192723889915\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6661404893449092,\n \"acc_stderr\": 0.013254029695143358\n\
\ }\n}\n```"
repo_url: https://huggingface.co/harborwater/open-llama-3b-everything-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|arc:challenge|25_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T00_43_57.732775
path:
- '**/details_harness|drop|3_2023-10-29T00-43-57.732775.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T00-43-57.732775.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T00_43_57.732775
path:
- '**/details_harness|gsm8k|5_2023-10-29T00-43-57.732775.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T00-43-57.732775.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hellaswag|10_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T09-37-10.252705.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-12T09-37-10.252705.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-12T09-37-10.252705.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T00_43_57.732775
path:
- '**/details_harness|winogrande|5_2023-10-29T00-43-57.732775.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T00-43-57.732775.parquet'
- config_name: results
data_files:
- split: 2023_10_12T09_37_10.252705
path:
- results_2023-10-12T09-37-10.252705.parquet
- split: 2023_10_29T00_43_57.732775
path:
- results_2023-10-29T00-43-57.732775.parquet
- split: latest
path:
- results_2023-10-29T00-43-57.732775.parquet
---
# Dataset Card for Evaluation run of harborwater/open-llama-3b-everything-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/harborwater/open-llama-3b-everything-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [harborwater/open-llama-3b-everything-v2](https://huggingface.co/harborwater/open-llama-3b-everything-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_harborwater__open-llama-3b-everything-v2",
"harness_winogrande_5",
split="train")
```
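As noted above, each run's split name is just its timestamp with `-` and `:` replaced by `_`. A small helper (purely illustrative, not part of the `datasets` API) makes that mapping explicit:

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp such as "2023-10-29T00:43:57.732775"
    to its split name "2023_10_29T00_43_57.732775"."""
    date, time = ts.split("T", 1)
    return date.replace("-", "_") + "T" + time.replace(":", "_")
```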
## Latest results
These are the [latest results from run 2023-10-29T00:43:57.732775](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__open-llama-3b-everything-v2/blob/main/results_2023-10-29T00-43-57.732775.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split for each eval):
```json
{
"all": {
"em": 0.0020973154362416107,
"em_stderr": 0.0004685065030368325,
"f1": 0.0560864093959733,
"f1_stderr": 0.0013597729822813858,
"acc": 0.341030820866541,
"acc_stderr": 0.008350924483766176
},
"harness|drop|3": {
"em": 0.0020973154362416107,
"em_stderr": 0.0004685065030368325,
"f1": 0.0560864093959733,
"f1_stderr": 0.0013597729822813858
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.0034478192723889915
},
"harness|winogrande|5": {
"acc": 0.6661404893449092,
"acc_stderr": 0.013254029695143358
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DopeorNope/new_instruct2 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: tag
dtype: string
splits:
- name: train
num_bytes: 404814227
num_examples: 97609
download_size: 200280865
dataset_size: 404814227
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bartoszmaj/stance_predictions | ---
license: openrail
dataset_info:
features:
- name: predictions
dtype: string
splits:
- name: train
num_bytes: 38492181
num_examples: 4600698
download_size: 2400251
dataset_size: 38492181
---
|
Hansollll/Translation | ---
dataset_info:
features:
- name: sn
dtype: string
- name: translation
struct:
- name: en
dtype: string
- name: ko
dtype: string
splits:
- name: train
num_bytes: 2460095.2
num_examples: 8000
- name: test
num_bytes: 615023.8
num_examples: 2000
download_size: 1973746
dataset_size: 3075119.0
---
# Dataset Card for "Translation"
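Per the schema declared above, each row pairs an `sn` identifier with a nested `translation` struct holding `en` and `ko` keys. The record below is purely illustrative (not an actual row from the dataset), but shows how a loaded example would be accessed:

```python
# Hypothetical record matching the declared features:
# an "sn" string plus a nested en/ko translation struct.
example = {
    "sn": "0001",
    "translation": {"en": "Hello.", "ko": "안녕하세요."},
}

# The parallel sentences live under the nested struct's language keys.
source = example["translation"]["en"]
target = example["translation"]["ko"]
```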
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mnli_me_coordinate_subjects | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 1723
num_examples: 7
- name: dev_mismatched
num_bytes: 6219
num_examples: 26
- name: test_matched
num_bytes: 5493
num_examples: 17
- name: test_mismatched
num_bytes: 4076
num_examples: 17
- name: train
num_bytes: 136120
num_examples: 549
download_size: 75026
dataset_size: 153631
---
# Dataset Card for "MULTI_VALUE_mnli_me_coordinate_subjects"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BiancoMat/metamat | ---
tags:
- art
--- |
razent/vi_pubmed_small | ---
language:
- vi
size_categories:
- 10K<n<100K
---
10K Vietnamese abstracts extracted from `VietAI/vi_pubmed` for demo purposes only. |
Nexdata/Mandarin_Pronunciation_Dictionary | ---
task_categories:
- automatic-speech-recognition
language:
- zh
---
# Dataset Card for Nexdata/Mandarin_Pronunciation_Dictionary
## Description
The data contains 570,060 entries. All words and pronunciations are produced by linguists. It can be used in the research and development of Mandarin ASR technology.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1094?source=Huggingface
# Specifications
## Format
TXT
## Data content
570,060 Mandarin words and corresponding pinyin
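The exact TXT layout is not documented here; the sketch below assumes (as a guess) one entry per line, with the word and its space-separated pinyin syllables divided by the first whitespace run:

```python
def parse_lexicon(lines):
    """Parse word-to-pinyin lexicon lines into a dict.

    Assumes one entry per line: the word, then whitespace, then the
    pinyin string. Blank lines are skipped.
    """
    lexicon = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue
        word, pinyin = line.split(None, 1)
        lexicon[word] = pinyin
    return lexicon
```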
## Language
Mandarin
## Application scenario
speech recognition
# Licensing Information
Commercial License |
kamilakesbi/callhome_jpn | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: timestamps_start
sequence: float64
- name: timestamps_end
sequence: float64
- name: speakers
sequence: string
splits:
- name: data
num_bytes: 2159798942.0
num_examples: 120
download_size: 2119318800
dataset_size: 2159798942.0
configs:
- config_name: default
data_files:
- split: data
path: data/data-*
---
|
jinlibao/lima_preference_dataset | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: rejected
dtype: string
- name: chosen
dtype: string
splits:
- name: train
num_bytes: 133888
num_examples: 50
download_size: 89232
dataset_size: 133888
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
carnival13/eng_sur_2_DA_tokenized | ---
dataset_info:
features:
- name: pass_label
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 127645465
num_examples: 160590
download_size: 26460153
dataset_size: 127645465
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "eng_sur_2_DA_tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MediaTek-Research/TCEval-v2 | ---
dataset_info:
- config_name: drcd
features:
- name: id
dtype: string
- name: paragraph
dtype: string
- name: question
dtype: string
- name: references
list: string
splits:
- name: test
num_bytes: 4899369
num_examples: 3493
- name: dev
num_bytes: 5845
num_examples: 5
download_size: 1168539
dataset_size: 4905214
- config_name: mt_bench_tw-coding
features:
- name: id
dtype: string
- name: turns
list: string
- name: reference
list: string
- name: category
dtype: string
splits:
- name: test
num_bytes: 11252
num_examples: 10
download_size: 10860
dataset_size: 11252
- config_name: mt_bench_tw-extraction
features:
- name: id
dtype: string
- name: turns
list: string
- name: reference
list: string
- name: category
dtype: string
splits:
- name: test
num_bytes: 10882
num_examples: 10
download_size: 17098
dataset_size: 10882
- config_name: mt_bench_tw-humanities
features:
- name: id
dtype: string
- name: turns
list: string
- name: reference
list: string
- name: category
dtype: string
splits:
- name: test
num_bytes: 2996
num_examples: 10
download_size: 5049
dataset_size: 2996
- config_name: mt_bench_tw-math
features:
- name: id
dtype: string
- name: turns
list: string
- name: reference
list: string
- name: category
dtype: string
splits:
- name: test
num_bytes: 3041
num_examples: 10
download_size: 5054
dataset_size: 3041
- config_name: mt_bench_tw-reasoning
features:
- name: id
dtype: string
- name: turns
list: string
- name: reference
list: string
- name: category
dtype: string
splits:
- name: test
num_bytes: 4492
num_examples: 10
download_size: 8402
dataset_size: 4492
- config_name: mt_bench_tw-roleplay
features:
- name: id
dtype: string
- name: turns
list: string
- name: reference
list: string
- name: category
dtype: string
splits:
- name: test
num_bytes: 4134
num_examples: 10
download_size: 6634
dataset_size: 4134
- config_name: mt_bench_tw-stem
features:
- name: id
dtype: string
- name: turns
list: string
- name: reference
list: string
- name: category
dtype: string
splits:
- name: test
num_bytes: 3103
num_examples: 10
download_size: 5430
dataset_size: 3103
- config_name: mt_bench_tw-writing
features:
- name: id
dtype: string
- name: turns
list: string
- name: reference
list: string
- name: category
dtype: string
splits:
- name: test
num_bytes: 3469
num_examples: 10
download_size: 6701
dataset_size: 3469
- config_name: penguin_table
features:
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: E
dtype: string
- name: answer
dtype: string
- name: id
dtype: string
splits:
- name: dev
num_bytes: 2588
num_examples: 5
- name: test
num_bytes: 74241
num_examples: 144
download_size: 21218
dataset_size: 76829
- config_name: tmmluplus-accounting
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 134876
num_examples: 191
- name: dev
num_bytes: 3764
num_examples: 5
download_size: 87921
dataset_size: 138640
- config_name: tmmluplus-administrative_law
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 169553
num_examples: 420
- name: dev
num_bytes: 2567
num_examples: 5
download_size: 107897
dataset_size: 172120
- config_name: tmmluplus-advance_chemistry
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 33891
num_examples: 123
- name: dev
num_bytes: 1581
num_examples: 5
download_size: 34210
dataset_size: 35472
- config_name: tmmluplus-agriculture
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 46502
num_examples: 151
- name: dev
num_bytes: 1715
num_examples: 5
download_size: 40849
dataset_size: 48217
- config_name: tmmluplus-anti_money_laundering
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 54293
num_examples: 134
- name: dev
num_bytes: 2552
num_examples: 5
download_size: 47614
dataset_size: 56845
- config_name: tmmluplus-auditing
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 272426
num_examples: 550
- name: dev
num_bytes: 1947
num_examples: 5
download_size: 147664
dataset_size: 274373
- config_name: tmmluplus-basic_medical_science
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 312503
num_examples: 954
- name: dev
num_bytes: 1599
num_examples: 5
download_size: 194337
dataset_size: 314102
- config_name: tmmluplus-business_management
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 45074
num_examples: 139
- name: dev
num_bytes: 1403
num_examples: 5
download_size: 39338
dataset_size: 46477
- config_name: tmmluplus-chinese_language_and_literature
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 127469
num_examples: 199
- name: dev
num_bytes: 2054
num_examples: 5
download_size: 103909
dataset_size: 129523
- config_name: tmmluplus-clinical_psychology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 55748
num_examples: 125
- name: dev
num_bytes: 2029
num_examples: 5
download_size: 51770
dataset_size: 57777
- config_name: tmmluplus-computer_science
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 57883
num_examples: 174
- name: dev
num_bytes: 1894
num_examples: 5
download_size: 49090
dataset_size: 59777
- config_name: tmmluplus-culinary_skills
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 94564
num_examples: 292
- name: dev
num_bytes: 1540
num_examples: 5
download_size: 69998
dataset_size: 96104
- config_name: tmmluplus-dentistry
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 152113
num_examples: 399
- name: dev
num_bytes: 1684
num_examples: 5
download_size: 105595
dataset_size: 153797
- config_name: tmmluplus-economics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 145972
num_examples: 393
- name: dev
num_bytes: 1946
num_examples: 5
download_size: 91284
dataset_size: 147918
- config_name: tmmluplus-education
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 44729
num_examples: 124
- name: dev
num_bytes: 1760
num_examples: 5
download_size: 41837
dataset_size: 46489
- config_name: tmmluplus-education_(profession_level)
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 208632
num_examples: 486
- name: dev
num_bytes: 3183
num_examples: 5
download_size: 136861
dataset_size: 211815
- config_name: tmmluplus-educational_psychology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 71860
num_examples: 176
- name: dev
num_bytes: 2314
num_examples: 5
download_size: 56964
dataset_size: 74174
- config_name: tmmluplus-engineering_math
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 35214
num_examples: 103
- name: dev
num_bytes: 1954
num_examples: 5
download_size: 33378
dataset_size: 37168
- config_name: tmmluplus-finance_banking
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 59005
num_examples: 135
- name: dev
num_bytes: 2232
num_examples: 5
download_size: 47576
dataset_size: 61237
- config_name: tmmluplus-financial_analysis
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 128903
num_examples: 382
- name: dev
num_bytes: 1537
num_examples: 5
download_size: 68492
dataset_size: 130440
- config_name: tmmluplus-fire_science
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 37661
num_examples: 124
- name: dev
num_bytes: 1690
num_examples: 5
download_size: 33612
dataset_size: 39351
- config_name: tmmluplus-general_principles_of_law
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 47582
num_examples: 106
- name: dev
num_bytes: 1777
num_examples: 5
download_size: 40369
dataset_size: 49359
- config_name: tmmluplus-geography_of_taiwan
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 242009
num_examples: 768
- name: dev
num_bytes: 1689
num_examples: 5
download_size: 144499
dataset_size: 243698
- config_name: tmmluplus-human_behavior
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 132226
num_examples: 309
- name: dev
num_bytes: 2149
num_examples: 5
download_size: 93526
dataset_size: 134375
- config_name: tmmluplus-insurance_studies
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 349058
num_examples: 760
- name: dev
num_bytes: 2023
num_examples: 5
download_size: 174957
dataset_size: 351081
- config_name: tmmluplus-introduction_to_law
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 93914
num_examples: 237
- name: dev
num_bytes: 3868
num_examples: 5
download_size: 72390
dataset_size: 97782
- config_name: tmmluplus-jce_humanities
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 95795
num_examples: 90
- name: dev
num_bytes: 6230
num_examples: 5
download_size: 79879
dataset_size: 102025
- config_name: tmmluplus-junior_chemistry
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 56079
num_examples: 209
- name: dev
num_bytes: 1472
num_examples: 5
download_size: 44646
dataset_size: 57551
- config_name: tmmluplus-junior_chinese_exam
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 169271
num_examples: 175
- name: dev
num_bytes: 7581
num_examples: 5
download_size: 139825
dataset_size: 176852
- config_name: tmmluplus-junior_math_exam
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 51452
num_examples: 175
- name: dev
num_bytes: 1511
num_examples: 5
download_size: 38704
dataset_size: 52963
- config_name: tmmluplus-junior_science_exam
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 105830
num_examples: 213
- name: dev
num_bytes: 2473
num_examples: 5
download_size: 78758
dataset_size: 108303
- config_name: tmmluplus-junior_social_studies
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 92873
num_examples: 126
- name: dev
num_bytes: 4171
num_examples: 5
download_size: 76559
dataset_size: 97044
- config_name: tmmluplus-logic_reasoning
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 40639
num_examples: 139
- name: dev
num_bytes: 1591
num_examples: 5
download_size: 31931
dataset_size: 42230
- config_name: tmmluplus-macroeconomics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 125238
num_examples: 411
- name: dev
num_bytes: 1510
num_examples: 5
download_size: 76559
dataset_size: 126748
- config_name: tmmluplus-management_accounting
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 105401
num_examples: 215
- name: dev
num_bytes: 2212
num_examples: 5
download_size: 63286
dataset_size: 107613
- config_name: tmmluplus-marketing_management
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 32431
num_examples: 93
- name: dev
num_bytes: 1802
num_examples: 5
download_size: 32600
dataset_size: 34233
- config_name: tmmluplus-mechanical
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 32709
num_examples: 118
- name: dev
num_bytes: 1112
num_examples: 5
download_size: 30409
dataset_size: 33821
- config_name: tmmluplus-music
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 91304
num_examples: 278
- name: dev
num_bytes: 1598
num_examples: 5
download_size: 68538
dataset_size: 92902
- config_name: tmmluplus-national_protection
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 55256
num_examples: 211
- name: dev
num_bytes: 1186
num_examples: 5
download_size: 42755
dataset_size: 56442
- config_name: tmmluplus-nautical_science
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 163848
num_examples: 551
- name: dev
num_bytes: 1131
num_examples: 5
download_size: 97058
dataset_size: 164979
- config_name: tmmluplus-occupational_therapy_for_psychological_disorders
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 268018
num_examples: 543
- name: dev
num_bytes: 2198
num_examples: 5
download_size: 152382
dataset_size: 270216
- config_name: tmmluplus-official_document_management
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 67868
num_examples: 222
- name: dev
num_bytes: 1752
num_examples: 5
download_size: 42263
dataset_size: 69620
- config_name: tmmluplus-optometry
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 367273
num_examples: 920
- name: dev
num_bytes: 1756
num_examples: 5
download_size: 197708
dataset_size: 369029
- config_name: tmmluplus-organic_chemistry
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 29720
num_examples: 109
- name: dev
num_bytes: 1316
num_examples: 5
download_size: 31856
dataset_size: 31036
- config_name: tmmluplus-pharmacology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 164131
num_examples: 577
- name: dev
num_bytes: 1040
num_examples: 5
download_size: 94751
dataset_size: 165171
- config_name: tmmluplus-pharmacy
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 113563
num_examples: 391
- name: dev
num_bytes: 1252
num_examples: 5
download_size: 77275
dataset_size: 114815
- config_name: tmmluplus-physical_education
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 47469
num_examples: 179
- name: dev
num_bytes: 1202
num_examples: 5
download_size: 39538
dataset_size: 48671
- config_name: tmmluplus-physics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 30030
num_examples: 97
- name: dev
num_bytes: 1191
num_examples: 5
download_size: 30370
dataset_size: 31221
- config_name: tmmluplus-politic_science
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 279612
num_examples: 995
- name: dev
num_bytes: 1444
num_examples: 5
download_size: 155705
dataset_size: 281056
- config_name: tmmluplus-real_estate
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 38600
num_examples: 92
- name: dev
num_bytes: 2599
num_examples: 5
download_size: 36955
dataset_size: 41199
- config_name: tmmluplus-secondary_physics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 45698
num_examples: 112
- name: dev
num_bytes: 1686
num_examples: 5
download_size: 41917
dataset_size: 47384
- config_name: tmmluplus-statistics_and_machine_learning
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 83999
num_examples: 224
- name: dev
num_bytes: 2368
num_examples: 5
download_size: 64213
dataset_size: 86367
- config_name: tmmluplus-taiwanese_hokkien
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 40896
num_examples: 129
- name: dev
num_bytes: 2197
num_examples: 5
download_size: 40308
dataset_size: 43093
- config_name: tmmluplus-taxation
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 154730
num_examples: 375
- name: dev
num_bytes: 1924
num_examples: 5
download_size: 97906
dataset_size: 156654
- config_name: tmmluplus-technical
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 94384
num_examples: 402
- name: dev
num_bytes: 1084
num_examples: 5
download_size: 60659
dataset_size: 95468
- config_name: tmmluplus-three_principles_of_people
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 33261
num_examples: 139
- name: dev
num_bytes: 1234
num_examples: 5
download_size: 28540
dataset_size: 34495
- config_name: tmmluplus-trade
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 179952
num_examples: 502
- name: dev
num_bytes: 1679
num_examples: 5
download_size: 98998
dataset_size: 181631
- config_name: tmmluplus-traditional_chinese_medicine_clinical_medicine
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 115490
num_examples: 278
- name: dev
num_bytes: 1922
num_examples: 5
download_size: 76367
dataset_size: 117412
- config_name: tmmluplus-trust_practice
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 155403
num_examples: 401
- name: dev
num_bytes: 2556
num_examples: 5
download_size: 94795
dataset_size: 157959
- config_name: tmmluplus-ttqav2
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 41379
num_examples: 113
- name: dev
num_bytes: 2246
num_examples: 5
download_size: 40353
dataset_size: 43625
- config_name: tmmluplus-tve_chinese_language
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 539326
num_examples: 483
- name: dev
num_bytes: 5360
num_examples: 5
download_size: 401013
dataset_size: 544686
- config_name: tmmluplus-tve_design
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 182865
num_examples: 480
- name: dev
num_bytes: 2304
num_examples: 5
download_size: 119979
dataset_size: 185169
- config_name: tmmluplus-tve_mathematics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 42519
num_examples: 150
- name: dev
num_bytes: 1290
num_examples: 5
download_size: 36304
dataset_size: 43809
- config_name: tmmluplus-tve_natural_sciences
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 139853
num_examples: 424
- name: dev
num_bytes: 2163
num_examples: 5
download_size: 100220
dataset_size: 142016
- config_name: tmmluplus-veterinary_pathology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 91700
num_examples: 283
- name: dev
num_bytes: 1803
num_examples: 5
download_size: 59000
dataset_size: 93503
- config_name: tmmluplus-veterinary_pharmacology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 151825
num_examples: 540
- name: dev
num_bytes: 1419
num_examples: 5
download_size: 81980
dataset_size: 153244
configs:
- config_name: drcd
data_files:
- split: test
path: drcd/test-*
- split: dev
path: drcd/dev-*
- config_name: mt_bench_tw-coding
data_files:
- split: test
path: mt_bench_tw-coding/test-*
- config_name: mt_bench_tw-extraction
data_files:
- split: test
path: mt_bench_tw-extraction/test-*
- config_name: mt_bench_tw-humanities
data_files:
- split: test
path: mt_bench_tw-humanities/test-*
- config_name: mt_bench_tw-math
data_files:
- split: test
path: mt_bench_tw-math/test-*
- config_name: mt_bench_tw-reasoning
data_files:
- split: test
path: mt_bench_tw-reasoning/test-*
- config_name: mt_bench_tw-roleplay
data_files:
- split: test
path: mt_bench_tw-roleplay/test-*
- config_name: mt_bench_tw-stem
data_files:
- split: test
path: mt_bench_tw-stem/test-*
- config_name: mt_bench_tw-writing
data_files:
- split: test
path: mt_bench_tw-writing/test-*
- config_name: penguin_table
data_files:
- split: dev
path: penguin_table/dev-*
- split: test
path: penguin_table/test-*
- config_name: tmmluplus-accounting
data_files:
- split: test
path: tmmluplus-accounting/test-*
- split: dev
path: tmmluplus-accounting/dev-*
- config_name: tmmluplus-administrative_law
data_files:
- split: test
path: tmmluplus-administrative_law/test-*
- split: dev
path: tmmluplus-administrative_law/dev-*
- config_name: tmmluplus-advance_chemistry
data_files:
- split: test
path: tmmluplus-advance_chemistry/test-*
- split: dev
path: tmmluplus-advance_chemistry/dev-*
- config_name: tmmluplus-agriculture
data_files:
- split: test
path: tmmluplus-agriculture/test-*
- split: dev
path: tmmluplus-agriculture/dev-*
- config_name: tmmluplus-anti_money_laundering
data_files:
- split: test
path: tmmluplus-anti_money_laundering/test-*
- split: dev
path: tmmluplus-anti_money_laundering/dev-*
- config_name: tmmluplus-auditing
data_files:
- split: test
path: tmmluplus-auditing/test-*
- split: dev
path: tmmluplus-auditing/dev-*
- config_name: tmmluplus-basic_medical_science
data_files:
- split: test
path: tmmluplus-basic_medical_science/test-*
- split: dev
path: tmmluplus-basic_medical_science/dev-*
- config_name: tmmluplus-business_management
data_files:
- split: test
path: tmmluplus-business_management/test-*
- split: dev
path: tmmluplus-business_management/dev-*
- config_name: tmmluplus-chinese_language_and_literature
data_files:
- split: test
path: tmmluplus-chinese_language_and_literature/test-*
- split: dev
path: tmmluplus-chinese_language_and_literature/dev-*
- config_name: tmmluplus-clinical_psychology
data_files:
- split: test
path: tmmluplus-clinical_psychology/test-*
- split: dev
path: tmmluplus-clinical_psychology/dev-*
- config_name: tmmluplus-computer_science
data_files:
- split: test
path: tmmluplus-computer_science/test-*
- split: dev
path: tmmluplus-computer_science/dev-*
- config_name: tmmluplus-culinary_skills
data_files:
- split: test
path: tmmluplus-culinary_skills/test-*
- split: dev
path: tmmluplus-culinary_skills/dev-*
- config_name: tmmluplus-dentistry
data_files:
- split: test
path: tmmluplus-dentistry/test-*
- split: dev
path: tmmluplus-dentistry/dev-*
- config_name: tmmluplus-economics
data_files:
- split: test
path: tmmluplus-economics/test-*
- split: dev
path: tmmluplus-economics/dev-*
- config_name: tmmluplus-education
data_files:
- split: test
path: tmmluplus-education/test-*
- split: dev
path: tmmluplus-education/dev-*
- config_name: tmmluplus-education_(profession_level)
data_files:
- split: test
path: tmmluplus-education_(profession_level)/test-*
- split: dev
path: tmmluplus-education_(profession_level)/dev-*
- config_name: tmmluplus-educational_psychology
data_files:
- split: test
path: tmmluplus-educational_psychology/test-*
- split: dev
path: tmmluplus-educational_psychology/dev-*
- config_name: tmmluplus-engineering_math
data_files:
- split: test
path: tmmluplus-engineering_math/test-*
- split: dev
path: tmmluplus-engineering_math/dev-*
- config_name: tmmluplus-finance_banking
data_files:
- split: test
path: tmmluplus-finance_banking/test-*
- split: dev
path: tmmluplus-finance_banking/dev-*
- config_name: tmmluplus-financial_analysis
data_files:
- split: test
path: tmmluplus-financial_analysis/test-*
- split: dev
path: tmmluplus-financial_analysis/dev-*
- config_name: tmmluplus-fire_science
data_files:
- split: test
path: tmmluplus-fire_science/test-*
- split: dev
path: tmmluplus-fire_science/dev-*
- config_name: tmmluplus-general_principles_of_law
data_files:
- split: test
path: tmmluplus-general_principles_of_law/test-*
- split: dev
path: tmmluplus-general_principles_of_law/dev-*
- config_name: tmmluplus-geography_of_taiwan
data_files:
- split: test
path: tmmluplus-geography_of_taiwan/test-*
- split: dev
path: tmmluplus-geography_of_taiwan/dev-*
- config_name: tmmluplus-human_behavior
data_files:
- split: test
path: tmmluplus-human_behavior/test-*
- split: dev
path: tmmluplus-human_behavior/dev-*
- config_name: tmmluplus-insurance_studies
data_files:
- split: test
path: tmmluplus-insurance_studies/test-*
- split: dev
path: tmmluplus-insurance_studies/dev-*
- config_name: tmmluplus-introduction_to_law
data_files:
- split: test
path: tmmluplus-introduction_to_law/test-*
- split: dev
path: tmmluplus-introduction_to_law/dev-*
- config_name: tmmluplus-jce_humanities
data_files:
- split: test
path: tmmluplus-jce_humanities/test-*
- split: dev
path: tmmluplus-jce_humanities/dev-*
- config_name: tmmluplus-junior_chemistry
data_files:
- split: test
path: tmmluplus-junior_chemistry/test-*
- split: dev
path: tmmluplus-junior_chemistry/dev-*
- config_name: tmmluplus-junior_chinese_exam
data_files:
- split: test
path: tmmluplus-junior_chinese_exam/test-*
- split: dev
path: tmmluplus-junior_chinese_exam/dev-*
- config_name: tmmluplus-junior_math_exam
data_files:
- split: test
path: tmmluplus-junior_math_exam/test-*
- split: dev
path: tmmluplus-junior_math_exam/dev-*
- config_name: tmmluplus-junior_science_exam
data_files:
- split: test
path: tmmluplus-junior_science_exam/test-*
- split: dev
path: tmmluplus-junior_science_exam/dev-*
- config_name: tmmluplus-junior_social_studies
data_files:
- split: test
path: tmmluplus-junior_social_studies/test-*
- split: dev
path: tmmluplus-junior_social_studies/dev-*
- config_name: tmmluplus-logic_reasoning
data_files:
- split: test
path: tmmluplus-logic_reasoning/test-*
- split: dev
path: tmmluplus-logic_reasoning/dev-*
- config_name: tmmluplus-macroeconomics
data_files:
- split: test
path: tmmluplus-macroeconomics/test-*
- split: dev
path: tmmluplus-macroeconomics/dev-*
- config_name: tmmluplus-management_accounting
data_files:
- split: test
path: tmmluplus-management_accounting/test-*
- split: dev
path: tmmluplus-management_accounting/dev-*
- config_name: tmmluplus-marketing_management
data_files:
- split: test
path: tmmluplus-marketing_management/test-*
- split: dev
path: tmmluplus-marketing_management/dev-*
- config_name: tmmluplus-mechanical
data_files:
- split: test
path: tmmluplus-mechanical/test-*
- split: dev
path: tmmluplus-mechanical/dev-*
- config_name: tmmluplus-music
data_files:
- split: test
path: tmmluplus-music/test-*
- split: dev
path: tmmluplus-music/dev-*
- config_name: tmmluplus-national_protection
data_files:
- split: test
path: tmmluplus-national_protection/test-*
- split: dev
path: tmmluplus-national_protection/dev-*
- config_name: tmmluplus-nautical_science
data_files:
- split: test
path: tmmluplus-nautical_science/test-*
- split: dev
path: tmmluplus-nautical_science/dev-*
- config_name: tmmluplus-occupational_therapy_for_psychological_disorders
data_files:
- split: test
path: tmmluplus-occupational_therapy_for_psychological_disorders/test-*
- split: dev
path: tmmluplus-occupational_therapy_for_psychological_disorders/dev-*
- config_name: tmmluplus-official_document_management
data_files:
- split: test
path: tmmluplus-official_document_management/test-*
- split: dev
path: tmmluplus-official_document_management/dev-*
- config_name: tmmluplus-optometry
data_files:
- split: test
path: tmmluplus-optometry/test-*
- split: dev
path: tmmluplus-optometry/dev-*
- config_name: tmmluplus-organic_chemistry
data_files:
- split: test
path: tmmluplus-organic_chemistry/test-*
- split: dev
path: tmmluplus-organic_chemistry/dev-*
- config_name: tmmluplus-pharmacology
data_files:
- split: test
path: tmmluplus-pharmacology/test-*
- split: dev
path: tmmluplus-pharmacology/dev-*
- config_name: tmmluplus-pharmacy
data_files:
- split: test
path: tmmluplus-pharmacy/test-*
- split: dev
path: tmmluplus-pharmacy/dev-*
- config_name: tmmluplus-physical_education
data_files:
- split: test
path: tmmluplus-physical_education/test-*
- split: dev
path: tmmluplus-physical_education/dev-*
- config_name: tmmluplus-physics
data_files:
- split: test
path: tmmluplus-physics/test-*
- split: dev
path: tmmluplus-physics/dev-*
- config_name: tmmluplus-politic_science
data_files:
- split: test
path: tmmluplus-politic_science/test-*
- split: dev
path: tmmluplus-politic_science/dev-*
- config_name: tmmluplus-real_estate
data_files:
- split: test
path: tmmluplus-real_estate/test-*
- split: dev
path: tmmluplus-real_estate/dev-*
- config_name: tmmluplus-secondary_physics
data_files:
- split: test
path: tmmluplus-secondary_physics/test-*
- split: dev
path: tmmluplus-secondary_physics/dev-*
- config_name: tmmluplus-statistics_and_machine_learning
data_files:
- split: test
path: tmmluplus-statistics_and_machine_learning/test-*
- split: dev
path: tmmluplus-statistics_and_machine_learning/dev-*
- config_name: tmmluplus-taiwanese_hokkien
data_files:
- split: test
path: tmmluplus-taiwanese_hokkien/test-*
- split: dev
path: tmmluplus-taiwanese_hokkien/dev-*
- config_name: tmmluplus-taxation
data_files:
- split: test
path: tmmluplus-taxation/test-*
- split: dev
path: tmmluplus-taxation/dev-*
- config_name: tmmluplus-technical
data_files:
- split: test
path: tmmluplus-technical/test-*
- split: dev
path: tmmluplus-technical/dev-*
- config_name: tmmluplus-three_principles_of_people
data_files:
- split: test
path: tmmluplus-three_principles_of_people/test-*
- split: dev
path: tmmluplus-three_principles_of_people/dev-*
- config_name: tmmluplus-trade
data_files:
- split: test
path: tmmluplus-trade/test-*
- split: dev
path: tmmluplus-trade/dev-*
- config_name: tmmluplus-traditional_chinese_medicine_clinical_medicine
data_files:
- split: test
path: tmmluplus-traditional_chinese_medicine_clinical_medicine/test-*
- split: dev
path: tmmluplus-traditional_chinese_medicine_clinical_medicine/dev-*
- config_name: tmmluplus-trust_practice
data_files:
- split: test
path: tmmluplus-trust_practice/test-*
- split: dev
path: tmmluplus-trust_practice/dev-*
- config_name: tmmluplus-ttqav2
data_files:
- split: test
path: tmmluplus-ttqav2/test-*
- split: dev
path: tmmluplus-ttqav2/dev-*
- config_name: tmmluplus-tve_chinese_language
data_files:
- split: test
path: tmmluplus-tve_chinese_language/test-*
- split: dev
path: tmmluplus-tve_chinese_language/dev-*
- config_name: tmmluplus-tve_design
data_files:
- split: test
path: tmmluplus-tve_design/test-*
- split: dev
path: tmmluplus-tve_design/dev-*
- config_name: tmmluplus-tve_mathematics
data_files:
- split: test
path: tmmluplus-tve_mathematics/test-*
- split: dev
path: tmmluplus-tve_mathematics/dev-*
- config_name: tmmluplus-tve_natural_sciences
data_files:
- split: test
path: tmmluplus-tve_natural_sciences/test-*
- split: dev
path: tmmluplus-tve_natural_sciences/dev-*
- config_name: tmmluplus-veterinary_pathology
data_files:
- split: test
path: tmmluplus-veterinary_pathology/test-*
- split: dev
path: tmmluplus-veterinary_pathology/dev-*
- config_name: tmmluplus-veterinary_pharmacology
data_files:
- split: test
path: tmmluplus-veterinary_pharmacology/test-*
- split: dev
path: tmmluplus-veterinary_pharmacology/dev-*
---
# TCEval v2
TCEval-v2 is a Traditional Chinese evaluation suite for foundation models, derived from TCEval-v1. It covers multiple capabilities, including contextual QA, knowledge, table understanding, and chat and instruction following.
## Benchmark
- **Contextual QA**
- **drcd**: DRCD is a Traditional Chinese machine reading comprehension dataset containing 10,014 paragraphs from 2,108 Wikipedia articles and over 30,000 questions.
- **Knowledge**
- **tmmluplus** (provided by MediaTek Research and iKala): Taiwan Massive Multitask Language Understanding+ (TMMLU+) is curated from examinations in Taiwan, consisting of 67 subjects spanning multiple disciplines, from vocational to academic fields, and covering elementary to professional proficiency levels. It is designed to identify a model’s knowledge and problem-solving blind spots in a manner similar to human evaluations. Subjects are categorized into STEM, humanities, social sciences, and other (similar to MMLU) for a higher-level overview of model capabilities.
- **Table Understanding**
- **penguin_table** (translated from a subset of [BIG-Bench](https://github.com/google/BIG-bench/tree/main/bigbench/benchmark_tasks/penguins_in_a_table)): The “penguins in a table” task from BIG-bench asks a language model to answer questions about the animals listed in a table, or multiple tables, described in the context.
- **Chat and instruction following**
- **mt_bench_tw** (translated from [MT Bench](https://huggingface.co/spaces/lmsys/mt-bench)): MT-Bench-TW is a Traditional Chinese version of MT-bench, which is a series of open-ended questions that evaluate a chatbot’s multi-turn conversational and instruction-following ability. MT-Bench-TW inherits the categorization of MT-Bench, which includes a wide variety of core capabilities, such as reasoning and writing.
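The configurations above follow a `tmmluplus-<subject>` naming scheme, each with `test` and `dev` splits. A minimal sketch of building config names and loading one with the `datasets` library (the subject list here is a small illustrative subset, and the `MediaTek-Research/TCEval-v2` repository id is an assumption):

```python
# Sketch: TMMLU+ subject configs follow the pattern "tmmluplus-<subject>".
# The subject list below is a small illustrative subset, not the full 67.
subjects = ["physics", "pharmacology", "geography_of_taiwan", "tve_design"]

def tmmluplus_config(subject: str) -> str:
    """Build the config name for a TMMLU+ subject."""
    return f"tmmluplus-{subject}"

configs = [tmmluplus_config(s) for s in subjects]
print(configs[0])  # tmmluplus-physics

# To actually load a split (requires `pip install datasets` and network access;
# the repository id below is an assumption):
#   from datasets import load_dataset
#   dev = load_dataset("MediaTek-Research/TCEval-v2", configs[0], split="dev")
```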
If you find the dataset useful in your work, please cite:
```
@misc{hsu2023advancing,
title={Advancing the Evaluation of Traditional Chinese Language Models: Towards a Comprehensive Benchmark Suite},
author={Chan-Jan Hsu and Chang-Le Liu and Feng-Ting Liao and Po-Chun Hsu and Yi-Chang Chen and Da-shan Shiu},
year={2023},
eprint={2309.08448},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
ixelszy/FingeringTI | ---
license: creativeml-openrail-m
---
|
one-sec-cv12/chunk_257 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 20717169408.0
num_examples: 215696
download_size: 18802186566
dataset_size: 20717169408.0
---
# Dataset Card for "chunk_257"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
enoahjr/twitter_dataset_1713226351 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 83499
num_examples: 215
download_size: 40250
dataset_size: 83499
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
9wimu9/wiki_support_docs_en | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: support_documents
sequence: string
splits:
- name: train
num_bytes: 793329594
num_examples: 170646
download_size: 479804174
dataset_size: 793329594
---
# Dataset Card for "wiki_support_docs_en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xx103/NYC_Motor_Vehicle_Collisions_and_Weather_Dataset | ---
license: unknown
language:
- en
tags:
- collisions
- weather
- motor vehicle
size_categories:
- 1M<n<10M
---
# Dataset Card for NYC Motor Vehicle Collisions and Weather Dataset
## Dataset Description
- **Homepage:**
Homepage for raw data:
- **NYC Motor Vehicle Collisions Data (2.3GB, 2,061,947 observations):** [View Dataset](https://data.cityofnewyork.us/Public-Safety/Motor-Vehicle-Collisions-Crashes/h9gi-nx95/about_data)
- **NYC Daily Weather Data from 2013 to 2023 (4.2MB, 4,016 observations):** [View Dataset](https://www.visualcrossing.com/weather/weather-data-services/new%20york%20city/metric/2013-01-01/2023-12-31)
- **NYC Borough Data (23.0KB, 245 observations):** [View Dataset](https://catalog.data.gov/dataset/nyc-domain-registrations-by-zip-code)
The NYC Motor Vehicle Collisions and Weather Dataset merges the NYC Motor Vehicle Collisions Data, the NYC Daily Weather Data, and the NYC Borough Data into a single, coherent dataset. The integration incorporates borough information for each zip code in New York City and attaches the weather conditions on the day of each collision. This approach facilitates comprehensive collision-weather analysis and enhances the understanding of collision patterns across boroughs, offering valuable insights along both dimensions of study.
### Dataset Summary
The NYC Motor Vehicle Collisions and Weather dataset, sourced from NYC Open Data and Visualcrossing, provides a comprehensive overview of police-reported motor vehicle collisions in the boroughs of New York City (the Bronx, Brooklyn, Manhattan, Queens, and Staten Island) from 2013 to 2023. This dataset includes detailed information such as crash time period, crash date, collision ID, borough, zip code, and precise latitude and longitude coordinates. Each entry also specifies the street name, street type, and the number of persons injured or killed. Additionally, the dataset encompasses the contributing factors for each vehicle involved, the types of vehicles in the collisions, as well as the temperature, precipitation, precipitation type, and weather description in NYC on the dates when the collisions occurred.
### Supported Tasks
Here are some key tasks that can be conducted using this dataset:
- **Time Series Analysis:** Analyze trends over time in motor vehicle collisions, including fluctuations in the number of accidents, injuries, and fatalities annually or seasonally.
- **Geospatial Analysis:** Utilize the latitude and longitude data to map collision locations, identifying hotspots or boroughs with higher frequencies of incidents.
- **Statistical Correlation and Causation Studies:** Investigate potential correlations between collision rates and various factors like time of day, weather conditions, traffic patterns(type of street), specific locations (boroughs or zip codes), vehicle types.
- **Machine Learning Predictive Models:** Develop predictive models to forecast the likelihood of collisions in certain areas or under specific conditions, aiding in preventive measures.
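As a toy sketch of the geospatial and statistical tasks above, counting incidents and injuries per borough over a few hypothetical records (field names mirror the data instance shown later in this card; the records themselves are made up):

```python
from collections import Counter

# Hypothetical records mimicking the dataset's fields (not real data).
collisions = [
    {"borough": "BROOKLYN", "number_of_injuries": 1},
    {"borough": "QUEENS", "number_of_injuries": 0},
    {"borough": "BROOKLYN", "number_of_injuries": 2},
]

# Incidents per borough: a first pass at spotting hotspots.
by_borough = Counter(rec["borough"] for rec in collisions)

# Injuries per borough, as a simple severity measure.
injuries = Counter()
for rec in collisions:
    injuries[rec["borough"]] += rec["number_of_injuries"]

print(by_borough.most_common(1))  # [('BROOKLYN', 2)]
```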
### Languages
English
## Dataset Structure
### Data Instances
```json
{
"crash_date": "2021-12-14",
"borough": "BROOKLYN",
"zip_code": "11211",
"latitude": 40.70918273925781,
"longitude": -73.95682525634766,
"collision_id": 4486555,
"crash_time_period": "15:00-17:59",
"contributing_factor_vehicles": ["Passing Too Closely", "Unspecified"],
"vehicle_types": ["Sedan", "Tractor Truck Diesel"],
"number_of_injuries": 0,
"number_of_deaths": 0,
"street_name": "BROOKLYN QUEENS EXPRESSWAY",
"street_type": "ON STREET",
"weather_description": "Clear conditions throughout the day.",
"precipitation": 0.0,
"precipitation_type": null,
"temp_max": 11.9,
"temp_min": 6.8
}
```
### Data Fields
- **`Crash Date`[Date]:** Occurrence date of collision.
- **`Borough`[string]:** Borough where collision occurred.
- **`Zip Code`[string]:** Postal code of incident occurrence.
- **`Latitude`[float]:** Latitude coordinate for Global Coordinate System.
- **`Longitude`[float]:** Longitude coordinate for Global Coordinate System.
- **`Collision ID`[integer]:** Unique record code generated by system. Primary Key for Collision table.
- **`Crash Time Period`[string]:** Classification of crash times into predefined intervals, such as 0:00-2:59, spanning 8 distinct time periods throughout the day.
- **`Street Name`[string]:** Street on which the collision occurred.
- **`Street Type`[string]:** On Street /Cross Street/ Off Street.
- **`Contributing Factors`[string]:** Factors contributing to the collision.
- **`Vehicle Types`[string]:** Type of vehicles involved in collision.
- **`Weather Description`[string]:** The weather conditions when collision occurred.
- **`Number of Injured`[integer]:** Number of people injured in the specified collision incident.
- **`Number of Deaths`[integer]:** Number of people killed in the specified collision incident.
- **`Precipitation`[float]:** The amount of precipitation that fell, in millimeters, on the day the collision occurred.
- **`Precipitation Type`[string]:** rain, snow, both, or none.
- **`Maximum Temperature`[float]:** The maximum temperature, in degrees Celsius, on the day the collision occurred.
- **`Minimum Temperature`[float]:** The minimum temperature, in degrees Celsius, on the day the collision occurred.
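The `Crash Time Period` buckets appear to be fixed three-hour windows, eight per day (e.g. `0:00-2:59`, `15:00-17:59`). A small helper that maps an hour to its window, under that assumption about the boundaries:

```python
def crash_time_period(hour: int) -> str:
    """Map an hour (0-23) to one of eight 3-hour windows, e.g. 16 -> '15:00-17:59'.

    The window boundaries are inferred from the card's '0:00-2:59' example
    and the '15:00-17:59' value in the data instance; they are an assumption.
    """
    if not 0 <= hour <= 23:
        raise ValueError("hour must be in 0..23")
    start = (hour // 3) * 3  # snap down to the start of the 3-hour window
    return f"{start}:00-{start + 2}:59"

print(crash_time_period(16))  # 15:00-17:59
```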
## Dataset Creation
### Curation Rationale
This dataset is curated to shed light on the impact of borough and weather on road safety. It enables a comprehensive analysis of how weather variations and locations influence the frequency and severity of collisions. In addition, it offers insights for enhancing urban planning and road safety measures and serves as a critical tool for conducting time series analysis, geospatial mapping, and statistical studies to identify trends and hotspots. Furthermore, it lays the groundwork for developing predictive models through machine learning, aiming to forecast collision occurrences under specific weather conditions. Ultimately, this dataset aspires to be a cornerstone for data-driven strategies to mitigate traffic-related incidents, bolstering efforts towards safer urban environments.
### Source Data
- **NYC Motor Vehicle Collisions Data**, provided by the New York City Police Department (NYPD), is available on the NYC Open Data platform.
- **NYC Daily Weather Data**, provided by Visualcrossing, is sourced from a variety of reputable historical weather data sources, including the Integrated Surface Database for global sub-hourly and hourly observations, MADIS with its extensive meteorological data like METAR, Integrated Mesonet Data, maritime data, and snow data from SNOTEL, the German Weather Service's (DWD) comprehensive database, the Global Historical Climate Network Daily (GHCN-D) for daily summaries, and sophisticated reanalysis data from ECMWF's ERA5 and NASA's MERRA-2.
- **NYC Borough Data**, provided by the Government of New York City, is available on the NYC Open Data Platform.
### Personal and Sensitive Information
Care has been taken to ensure that the dataset does not include direct personal or sensitive information about individuals involved in the collisions. While the dataset provides detailed geographic coordinates of collisions, it does not include names, addresses, or any other information that could be used to identify individuals involved. Users of the dataset are urged to follow ethical guidelines and privacy laws when analyzing or sharing insights derived from this data.
## Considerations for Using the Data
### Social Impact of Dataset
The NYC Motor Vehicle Collisions and Weather Dataset, a fusion of NYPD's collision data, NYC government’s borough data, and Visualcrossing's weather insights, offers a vital resource for understanding the interplay between weather conditions and road safety. Its comprehensive analysis potential enables urban planners and researchers to devise strategies aimed at reducing traffic incidents, thereby enhancing public safety. This dataset represents a significant step towards a more data-informed approach in urban safety and planning, while maintaining a strong commitment to ethical data use and privacy.
### Other Known Limitations
1. **Incomplete Geographical Data**: A notable limitation of this dataset is the occasional absence of key geographical details such as zip codes, geocodes, borough names, or specific street types (on-street, cross street, off-street). This missing information can hinder the accuracy of geospatial analyses and may lead to an incomplete understanding of collision distributions and patterns within the city.
2. **Unspecified Contributing Factors**: The dataset sometimes lacks specificity in detailing the contributing factors for vehicle collisions. Instances where these factors are labeled as 'unspecified' or are missing can lead to challenges in accurately determining the causes of accidents. This lack of detail may impact studies focused on understanding and mitigating the root causes of collisions.
3. **Generalized Weather Data**: The weather data included is based on daily records, which might not precisely reflect the weather conditions at the exact time of each collision. This temporal mismatch can introduce biases in analyses that aim to correlate specific weather conditions with the occurrence of road incidents. As a result, conclusions drawn about the impact of weather on collision rates and severity might be less accurate or comprehensive.
## Additional Information
### Contributions
This dataset was made possible through the invaluable contributions of NYC Open Data and the New York City Police Department (NYPD), providing extensive collision and borough data, and Visual Crossing, for their comprehensive weather data. I extend my deepest gratitude to these organizations for their pivotal role in enabling this research and for their commitment to open data accessibility. |
open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test | ---
pretty_name: Evaluation run of eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test](https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T23:08:06.310382](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test/blob/main/results_2024-03-09T23-08-06.310382.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6538249490414755,\n\
\ \"acc_stderr\": 0.03205307034724896,\n \"acc_norm\": 0.6534435010472049,\n\
\ \"acc_norm_stderr\": 0.03272118621923929,\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.77524154156829,\n\
\ \"mc2_stderr\": 0.013791360215680813\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068745,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710695\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7155945030870344,\n\
\ \"acc_stderr\": 0.004502088287470137,\n \"acc_norm\": 0.8908583947420833,\n\
\ \"acc_norm_stderr\": 0.003111795320787943\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726855,\n \"\
acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726855\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n\
\ \"acc_stderr\": 0.016611393687268584,\n \"acc_norm\": 0.4424581005586592,\n\
\ \"acc_norm_stderr\": 0.016611393687268584\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015055,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015055\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507205,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507205\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.77524154156829,\n\
\ \"mc2_stderr\": 0.013791360215680813\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272956\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6884003032600455,\n \
\ \"acc_stderr\": 0.01275737537675494\n }\n}\n```"
repo_url: https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|winogrande|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T23-08-06.310382.parquet'
- config_name: results
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- results_2024-03-09T23-08-06.310382.parquet
- split: latest
path:
- results_2024-03-09T23-08-06.310382.parquet
---
# Dataset Card for Evaluation run of eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test](https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-09T23:08:06.310382](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test/blob/main/results_2024-03-09T23-08-06.310382.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6538249490414755,
"acc_stderr": 0.03205307034724896,
"acc_norm": 0.6534435010472049,
"acc_norm_stderr": 0.03272118621923929,
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.77524154156829,
"mc2_stderr": 0.013791360215680813
},
"harness|arc:challenge|25": {
"acc": 0.7013651877133106,
"acc_stderr": 0.013374078615068745,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710695
},
"harness|hellaswag|10": {
"acc": 0.7155945030870344,
"acc_stderr": 0.004502088287470137,
"acc_norm": 0.8908583947420833,
"acc_norm_stderr": 0.003111795320787943
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726855,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4424581005586592,
"acc_stderr": 0.016611393687268584,
"acc_norm": 0.4424581005586592,
"acc_norm_stderr": 0.016611393687268584
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015055,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015055
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507205,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507205
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.77524154156829,
"mc2_stderr": 0.013791360215680813
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.010120623252272956
},
"harness|gsm8k|5": {
"acc": 0.6884003032600455,
"acc_stderr": 0.01275737537675494
}
}
```
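The per-task metrics in the JSON above can be aggregated locally once downloaded. The sketch below averages the headline scores for a hand-picked subset of tasks (the task keys and values are copied from the results above; the subset is illustrative only, not the leaderboard's official averaging scheme):

```python
# Sketch: average a few of the per-task scores reported in the results JSON.
# Values are copied verbatim from the run above; only a subset of tasks is shown.
results = {
    "harness|arc:challenge|25": 0.7312286689419796,  # acc_norm
    "harness|hellaswag|10": 0.8908583947420833,      # acc_norm
    "harness|winogrande|5": 0.8468823993685872,      # acc
    "harness|gsm8k|5": 0.6884003032600455,           # acc
}

average = sum(results.values()) / len(results)
print(f"Average over {len(results)} tasks: {average:.4f}")  # → Average over 4 tasks: 0.7893
```

The full results file contains all 60+ task entries, so the same loop can be run over every key once the JSON linked above is loaded.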
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AneeqMalik/Enter-Your-hub-name | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: audio_names
dtype: string
- name: genere
dtype:
class_label:
names:
'0': bad
'1': okay
'2': good
'3': great
splits:
- name: train
num_bytes: 12388426.0
num_examples: 6
download_size: 12391275
dataset_size: 12388426.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Enter-Your-hub-name"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Niche-Squad/balloon | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: image_id
dtype: int64
- name: filename
dtype: string
- name: annotations
sequence:
- name: id
dtype: int64
- name: image_id
dtype: int64
- name: category_id
dtype: int64
- name: iscrowd
dtype: int64
- name: area
dtype: float64
- name: bbox
sequence: float64
length: 4
- name: segmentation
sequence:
sequence: int64
splits:
- name: train
num_bytes: 23704472.0
num_examples: 47
- name: validation
num_bytes: 6532081.0
num_examples: 11
- name: test
num_bytes: 8824061.0
num_examples: 16
download_size: 38896408
dataset_size: 39060614.0
---
# Dataset Card for "balloon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Winterda/TQHWYNCGSE | ---
license: cc
---
|
thiaguinho/bell | ---
license: other
license_name: bell
license_link: LICENSE
---
|
arthurneuron/cryptocurrency-futures-ohlcv-dataset-1m | ---
license: mit
---
|
FaalSa/data10 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 17310
num_examples: 1
- name: validation
num_bytes: 17790
num_examples: 1
- name: test
num_bytes: 18270
num_examples: 1
download_size: 9510
dataset_size: 53370
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
ypilseong/deepnlp_autotrain_Empathy_chat_data | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30 | ---
pretty_name: Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-30
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wang7776/Mistral-7B-Instruct-v0.2-sparsity-30](https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-30)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-15T11:11:23.952137](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30/blob/main/results_2024-01-15T11-11-23.952137.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46463869066130487,\n\
\ \"acc_stderr\": 0.034455801387647846,\n \"acc_norm\": 0.4711053080225253,\n\
\ \"acc_norm_stderr\": 0.035249688625421514,\n \"mc1\": 0.2998776009791922,\n\
\ \"mc1_stderr\": 0.016040352966713623,\n \"mc2\": 0.4553313356020083,\n\
\ \"mc2_stderr\": 0.01500792603148901\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.45563139931740615,\n \"acc_stderr\": 0.014553749939306864,\n\
\ \"acc_norm\": 0.5110921501706485,\n \"acc_norm_stderr\": 0.014607794914013048\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5652260505875324,\n\
\ \"acc_stderr\": 0.004947141797384131,\n \"acc_norm\": 0.7572196773551085,\n\
\ \"acc_norm_stderr\": 0.004278871104930366\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5207547169811321,\n \"acc_stderr\": 0.030746349975723463,\n\
\ \"acc_norm\": 0.5207547169811321,\n \"acc_norm_stderr\": 0.030746349975723463\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4305555555555556,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.4305555555555556,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\
\ \"acc_stderr\": 0.03804749744364763,\n \"acc_norm\": 0.4682080924855491,\n\
\ \"acc_norm_stderr\": 0.03804749744364763\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171451,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171451\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33617021276595743,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.33617021276595743,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4068965517241379,\n \"acc_stderr\": 0.04093793981266237,\n\
\ \"acc_norm\": 0.4068965517241379,\n \"acc_norm_stderr\": 0.04093793981266237\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.023068188848261135,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.023068188848261135\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n\
\ \"acc_stderr\": 0.028414985019707868,\n \"acc_norm\": 0.5225806451612903,\n\
\ \"acc_norm_stderr\": 0.028414985019707868\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.03366124489051449,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.03366124489051449\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.03872592983524754,\n\
\ \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.03872592983524754\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6111111111111112,\n \"acc_stderr\": 0.0347327959083696,\n \"acc_norm\"\
: 0.6111111111111112,\n \"acc_norm_stderr\": 0.0347327959083696\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6580310880829016,\n \"acc_stderr\": 0.034234651001042844,\n\
\ \"acc_norm\": 0.6580310880829016,\n \"acc_norm_stderr\": 0.034234651001042844\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.47435897435897434,\n \"acc_stderr\": 0.025317649726448656,\n\
\ \"acc_norm\": 0.47435897435897434,\n \"acc_norm_stderr\": 0.025317649726448656\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371216,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371216\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.031968769891957786,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.031968769891957786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6146788990825688,\n \"acc_stderr\": 0.020865850852794122,\n \"\
acc_norm\": 0.6146788990825688,\n \"acc_norm_stderr\": 0.020865850852794122\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5980392156862745,\n \"acc_stderr\": 0.03441190023482465,\n \"\
acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.03441190023482465\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5991561181434599,\n \"acc_stderr\": 0.031900803894732356,\n \
\ \"acc_norm\": 0.5991561181434599,\n \"acc_norm_stderr\": 0.031900803894732356\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n\
\ \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.5246636771300448,\n\
\ \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5337423312883436,\n \"acc_stderr\": 0.03919415545048409,\n\
\ \"acc_norm\": 0.5337423312883436,\n \"acc_norm_stderr\": 0.03919415545048409\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.044642857142857144,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.044642857142857144\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.04846748253977239,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.04846748253977239\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6709401709401709,\n\
\ \"acc_stderr\": 0.03078232157768817,\n \"acc_norm\": 0.6709401709401709,\n\
\ \"acc_norm_stderr\": 0.03078232157768817\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6347381864623244,\n\
\ \"acc_stderr\": 0.01721853002883864,\n \"acc_norm\": 0.6347381864623244,\n\
\ \"acc_norm_stderr\": 0.01721853002883864\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.026915047355369804,\n\
\ \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.026915047355369804\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.02845263998508801,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.02845263998508801\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5401929260450161,\n\
\ \"acc_stderr\": 0.028306190403305696,\n \"acc_norm\": 0.5401929260450161,\n\
\ \"acc_norm_stderr\": 0.028306190403305696\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668773,\n\
\ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668773\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.33376792698826596,\n\
\ \"acc_stderr\": 0.012043812655846142,\n \"acc_norm\": 0.33376792698826596,\n\
\ \"acc_norm_stderr\": 0.012043812655846142\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45955882352941174,\n \"acc_stderr\": 0.030273325077345755,\n\
\ \"acc_norm\": 0.45955882352941174,\n \"acc_norm_stderr\": 0.030273325077345755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4297385620915033,\n \"acc_stderr\": 0.02002712278492854,\n \
\ \"acc_norm\": 0.4297385620915033,\n \"acc_norm_stderr\": 0.02002712278492854\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5551020408163265,\n \"acc_stderr\": 0.031814251181977865,\n\
\ \"acc_norm\": 0.5551020408163265,\n \"acc_norm_stderr\": 0.031814251181977865\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.37349397590361444,\n\
\ \"acc_stderr\": 0.03765845117168862,\n \"acc_norm\": 0.37349397590361444,\n\
\ \"acc_norm_stderr\": 0.03765845117168862\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6257309941520468,\n \"acc_stderr\": 0.03711601185389481,\n\
\ \"acc_norm\": 0.6257309941520468,\n \"acc_norm_stderr\": 0.03711601185389481\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2998776009791922,\n\
\ \"mc1_stderr\": 0.016040352966713623,\n \"mc2\": 0.4553313356020083,\n\
\ \"mc2_stderr\": 0.01500792603148901\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6898184688239937,\n \"acc_stderr\": 0.013000454144859893\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10538286580742987,\n \
\ \"acc_stderr\": 0.00845757588404174\n }\n}\n```"
repo_url: https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-30
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|arc:challenge|25_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|gsm8k|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hellaswag|10_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T11-11-23.952137.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T11-11-23.952137.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- '**/details_harness|winogrande|5_2024-01-15T11-11-23.952137.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-15T11-11-23.952137.parquet'
- config_name: results
data_files:
- split: 2024_01_15T11_11_23.952137
path:
- results_2024-01-15T11-11-23.952137.parquet
- split: latest
path:
- results_2024-01-15T11-11-23.952137.parquet
---
# Dataset Card for Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-30
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [wang7776/Mistral-7B-Instruct-v0.2-sparsity-30](https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-30) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30",
	"harness_winogrande_5",
	split="latest")
```
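The configuration names above follow a regular `harness_<task>_<n_shots>` pattern. As a sketch, a small helper (our own, not part of the leaderboard tooling) can split a config name into its task and few-shot count:

```python
# Split an Open LLM Leaderboard config name of the form "harness_<task>_<n_shots>"
# into its task name and few-shot count. Illustrative helper only.
def parse_config(name):
    _, _, rest = name.partition("_")        # drop the "harness" prefix
    task, _, shots = rest.rpartition("_")   # the few-shot count is the last component
    return task, int(shots)

print(parse_config("harness_hendrycksTest_moral_disputes_5"))
# → ('hendrycksTest_moral_disputes', 5)
```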
## Latest results
These are the [latest results from run 2024-01-15T11:11:23.952137](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30/blob/main/results_2024-01-15T11-11-23.952137.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.46463869066130487,
"acc_stderr": 0.034455801387647846,
"acc_norm": 0.4711053080225253,
"acc_norm_stderr": 0.035249688625421514,
"mc1": 0.2998776009791922,
"mc1_stderr": 0.016040352966713623,
"mc2": 0.4553313356020083,
"mc2_stderr": 0.01500792603148901
},
"harness|arc:challenge|25": {
"acc": 0.45563139931740615,
"acc_stderr": 0.014553749939306864,
"acc_norm": 0.5110921501706485,
"acc_norm_stderr": 0.014607794914013048
},
"harness|hellaswag|10": {
"acc": 0.5652260505875324,
"acc_stderr": 0.004947141797384131,
"acc_norm": 0.7572196773551085,
"acc_norm_stderr": 0.004278871104930366
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5207547169811321,
"acc_stderr": 0.030746349975723463,
"acc_norm": 0.5207547169811321,
"acc_norm_stderr": 0.030746349975723463
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364763,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364763
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171451,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171451
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33617021276595743,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.33617021276595743,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4068965517241379,
"acc_stderr": 0.04093793981266237,
"acc_norm": 0.4068965517241379,
"acc_norm_stderr": 0.04093793981266237
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.023068188848261135,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.023068188848261135
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.028414985019707868,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.028414985019707868
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.03366124489051449,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.03366124489051449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.03872592983524754,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.03872592983524754
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0347327959083696,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0347327959083696
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6580310880829016,
"acc_stderr": 0.034234651001042844,
"acc_norm": 0.6580310880829016,
"acc_norm_stderr": 0.034234651001042844
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47435897435897434,
"acc_stderr": 0.025317649726448656,
"acc_norm": 0.47435897435897434,
"acc_norm_stderr": 0.025317649726448656
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02671924078371216,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02671924078371216
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.031968769891957786,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.031968769891957786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6146788990825688,
"acc_stderr": 0.020865850852794122,
"acc_norm": 0.6146788990825688,
"acc_norm_stderr": 0.020865850852794122
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.03441190023482465,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.03441190023482465
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5991561181434599,
"acc_stderr": 0.031900803894732356,
"acc_norm": 0.5991561181434599,
"acc_norm_stderr": 0.031900803894732356
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.03351695167652628,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.03351695167652628
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5337423312883436,
"acc_stderr": 0.03919415545048409,
"acc_norm": 0.5337423312883436,
"acc_norm_stderr": 0.03919415545048409
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.044642857142857144,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.044642857142857144
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6709401709401709,
"acc_stderr": 0.03078232157768817,
"acc_norm": 0.6709401709401709,
"acc_norm_stderr": 0.03078232157768817
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6347381864623244,
"acc_stderr": 0.01721853002883864,
"acc_norm": 0.6347381864623244,
"acc_norm_stderr": 0.01721853002883864
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.026915047355369804,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.026915047355369804
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.02845263998508801,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.02845263998508801
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5401929260450161,
"acc_stderr": 0.028306190403305696,
"acc_norm": 0.5401929260450161,
"acc_norm_stderr": 0.028306190403305696
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.027628737155668773,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.027628737155668773
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.33376792698826596,
"acc_stderr": 0.012043812655846142,
"acc_norm": 0.33376792698826596,
"acc_norm_stderr": 0.012043812655846142
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45955882352941174,
"acc_stderr": 0.030273325077345755,
"acc_norm": 0.45955882352941174,
"acc_norm_stderr": 0.030273325077345755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4297385620915033,
"acc_stderr": 0.02002712278492854,
"acc_norm": 0.4297385620915033,
"acc_norm_stderr": 0.02002712278492854
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5551020408163265,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.5551020408163265,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495301,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-virology|5": {
"acc": 0.37349397590361444,
"acc_stderr": 0.03765845117168862,
"acc_norm": 0.37349397590361444,
"acc_norm_stderr": 0.03765845117168862
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6257309941520468,
"acc_stderr": 0.03711601185389481,
"acc_norm": 0.6257309941520468,
"acc_norm_stderr": 0.03711601185389481
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2998776009791922,
"mc1_stderr": 0.016040352966713623,
"mc2": 0.4553313356020083,
"mc2_stderr": 0.01500792603148901
},
"harness|winogrande|5": {
"acc": 0.6898184688239937,
"acc_stderr": 0.013000454144859893
},
"harness|gsm8k|5": {
"acc": 0.10538286580742987,
"acc_stderr": 0.00845757588404174
}
}
```
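The per-task `hendrycksTest` scores above are reported on the leaderboard as a single macro-average. A minimal sketch of that aggregation, using a small hypothetical excerpt of the results dict rather than the full set of tasks:

```python
# Macro-average accuracy over the MMLU (hendrycksTest) tasks of a results dict
# shaped like the JSON above. Only a three-task excerpt is used here.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.25},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.45185185185185184},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.46710526315789475},
    "harness|winogrande|5": {"acc": 0.6898184688239937},  # not an MMLU task, filtered out
}

mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
print(round(sum(mmlu_accs) / len(mmlu_accs), 4))  # → 0.3897
```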
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Lolz14/DelFin | ---
license: apache-2.0
---
|
napatswift/thbud-doc-ocr | ---
dataset_info:
features:
- name: words
sequence: string
- name: norm_bboxes
sequence:
sequence: float64
- name: ner_tags
sequence: 'null'
- name: class
dtype:
class_label:
names:
'0': toc
'1': entry
'2': other
splits:
- name: train
num_bytes: 6887148
num_examples: 1078
download_size: 2658905
dataset_size: 6887148
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "thbud-doc-ocr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/find_sent_before_sent_train_400_eval_40_random_permute_1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 3060300.2129436326
num_examples: 2434
- name: validation
num_bytes: 232610
num_examples: 200
download_size: 1042600
dataset_size: 3292910.2129436326
---
# Dataset Card for "find_sent_before_sent_train_400_eval_40_random_permute_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_qqp_object_pronoun_drop | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 448107
num_examples: 2314
- name: test
num_bytes: 4661823
num_examples: 24104
- name: train
num_bytes: 4189364
num_examples: 21369
download_size: 5780402
dataset_size: 9299294
---
# Dataset Card for "MULTI_VALUE_qqp_object_pronoun_drop"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FitzPC/HqDC-1.4M | ---
license: apache-2.0
---
|
JTBTechnology/taoyuan_travel_qa | ---
language:
- zh
- en
- ja
- ko
- id
- vi
- th
license: apache-2.0
size_categories:
- 10K<n<100K
task_categories:
- translation
pretty_name: traverl_qa
dataset_info:
- config_name: en-zh_tw
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 385263
num_examples: 2256
download_size: 124124
dataset_size: 385263
- config_name: id-zh_tw
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 407538
num_examples: 2256
download_size: 124778
dataset_size: 407538
- config_name: ja-zh_tw
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 441775
num_examples: 2256
download_size: 135723
dataset_size: 441775
- config_name: ko-zh_tw
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 414010
num_examples: 2256
download_size: 132364
dataset_size: 414010
- config_name: th-zh_tw
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 569121
num_examples: 2256
download_size: 166608
dataset_size: 569121
- config_name: vi-zh_tw
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 444668
num_examples: 2256
download_size: 138816
dataset_size: 444668
configs:
- config_name: en-zh_tw
data_files:
- split: train
path: en-zh_tw/train-*
- config_name: id-zh_tw
data_files:
- split: train
path: id-zh_tw/train-*
- config_name: ja-zh_tw
data_files:
- split: train
path: ja-zh_tw/train-*
- config_name: ko-zh_tw
data_files:
- split: train
path: ko-zh_tw/train-*
- config_name: th-zh_tw
data_files:
- split: train
path: th-zh_tw/train-*
- config_name: vi-zh_tw
data_files:
- split: train
path: vi-zh_tw/train-*
tags:
- 台灣
- 桃園捷運
- 旅遊
---
# Taoyuan Metro Travel Q&A Translation Dataset
This project covers seven languages: Traditional Chinese, English, Japanese, Korean, Indonesian, Vietnamese, and Thai.
All data is synthetic, generated end-to-end by a language model (GPT-4); each language pair contains 2K examples. |
atmallen/quirky_addition_increment3 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 6755254
num_examples: 100000
- name: validation
num_bytes: 675054
num_examples: 10000
- name: test
num_bytes: 675338
num_examples: 10000
download_size: 1898397
dataset_size: 8105646
---
# Dataset Card for "quirky_addition_increment3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FeiVw/umatest | ---
license: mit
---
|
esc-benchmark/esc-datasets | ---
annotations_creators:
- expert-generated
- crowdsourced
- machine-generated
language:
- en
language_creators:
- crowdsourced
- expert-generated
license:
- cc-by-4.0
- apache-2.0
- cc0-1.0
- cc-by-nc-3.0
- other
multilinguality:
- monolingual
pretty_name: esc-datasets
size_categories:
- 100K<n<1M
- 1M<n<10M
source_datasets:
- original
- extended|librispeech_asr
- extended|common_voice
tags:
- asr
- benchmark
- speech
- esc
task_categories:
- automatic-speech-recognition
task_ids: []
extra_gated_prompt: |-
Three of the ESC datasets have specific terms of usage that must be agreed to before using the data.
To do so, fill in the access forms on the specific datasets' pages:
* Common Voice: https://huggingface.co/datasets/mozilla-foundation/common_voice_9_0
* GigaSpeech: https://huggingface.co/datasets/speechcolab/gigaspeech
* SPGISpeech: https://huggingface.co/datasets/kensho/spgispeech
extra_gated_fields:
I hereby confirm that I have registered on the original Common Voice page and agree to not attempt to determine the identity of speakers in the Common Voice dataset: checkbox
I hereby confirm that I have accepted the terms of usages on GigaSpeech page: checkbox
I hereby confirm that I have accepted the terms of usages on SPGISpeech page: checkbox
---
All eight of the datasets in ESC can be downloaded and prepared in just a single line of code through the Hugging Face Datasets library:
```python
from datasets import load_dataset
librispeech = load_dataset("esc-benchmark/esc-datasets", "librispeech", split="train")
```
- `"esc-benchmark"`: the repository namespace. This is fixed for all ESC datasets.
- `"librispeech"`: the dataset name. This can be changed to any one of the eight datasets in ESC to download that dataset.
- `split="train"`: the split. Set this to one of train/validation/test to generate a specific split. Omit the `split` argument to generate all splits for a dataset.
The datasets are fully prepared, such that the audio and transcription files can be used directly in training/evaluation scripts.
## Dataset Information
A data point can be accessed by indexing the dataset object loaded through `load_dataset`:
```python
print(librispeech[0])
```
A typical data point comprises the path to the audio file and its transcription. Also included are the name of the dataset from which the sample derives and a unique identifier:
```python
{
'dataset': 'librispeech',
'audio': {'path': '/home/esc-bencher/.cache/huggingface/datasets/downloads/extracted/d2da1969fe9e7d06661b5dc370cf2e3c119a14c35950045bcb76243b264e4f01/374-180298-0000.flac',
'array': array([ 7.01904297e-04, 7.32421875e-04, 7.32421875e-04, ...,
-2.74658203e-04, -1.83105469e-04, -3.05175781e-05]),
'sampling_rate': 16000},
'text': 'chapter sixteen i might have told you of the beginning of this liaison in a few lines but i wanted you to see every step by which we came i to agree to whatever marguerite wished',
'id': '374-180298-0000'
}
```
### Data Fields
- `dataset`: name of the ESC dataset from which the sample is taken.
- `audio`: a dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate.
- `text`: the transcription of the audio file.
- `id`: unique id of the data sample.
### Data Preparation
#### Audio
The audio for all ESC datasets is segmented into sample lengths suitable for training ASR systems. The Hugging Face datasets library decodes audio files on the fly, reading the segments and converting them to Python arrays. Consequently, no further preparation of the audio is required for use in training/evaluation scripts.
Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, i.e. `dataset[0]["audio"]` should always be preferred over `dataset["audio"][0]`.
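To make the resampling step concrete, here is a rough sketch of what converting a signal to a new sampling rate involves, using simple linear interpolation. The actual decoder uses a proper band-limited resampling filter, so this is illustrative only:

```python
# Toy resampler: linear interpolation from src_rate to dst_rate.
# Real audio resampling uses band-limited filters; this only illustrates the idea.
def resample_linear(samples, src_rate, dst_rate):
    n_out = int(len(samples) * dst_rate / src_rate)
    out = []
    for i in range(n_out):
        pos = i * src_rate / dst_rate          # fractional position in the source signal
        lo = int(pos)
        hi = min(lo + 1, len(samples) - 1)
        frac = pos - lo
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out

# A hypothetical 4-sample clip at 8 kHz, upsampled to 16 kHz.
print(resample_linear([0.0, 1.0, 0.0, -1.0], 8000, 16000))
# → [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -1.0]
```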
#### Transcriptions
The transcriptions corresponding to each audio file are provided in their 'error corrected' format. No transcription pre-processing is applied to the text beyond the necessary 'error correction' steps, such as removing junk tokens (_<unk>_) or converting spelled-out punctuation tokens to their symbolic form (_<comma>_ to _,_). As such, no further preparation of the transcriptions is required for use in training/evaluation scripts.
Transcriptions are provided for training and validation splits. The transcriptions are **not** provided for the test splits. The ESC benchmark requires you to generate predictions for the test sets and upload them to https://huggingface.co/spaces/esc-benchmark/esc for scoring.
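The 'error correction' steps described above amount to simple string normalisation. A minimal sketch, assuming token names like those quoted above (the exact per-dataset rules may differ):

```python
import re

# Sketch of the 'error correction' applied to raw transcriptions: drop junk
# tokens and convert spelled-out punctuation to symbols. Illustrative only.
def correct(transcript):
    transcript = transcript.replace(" <comma>", ",").replace(" <period>", ".")
    transcript = transcript.replace("<unk>", "")
    return re.sub(r"\s+", " ", transcript).strip()

print(correct("hello <unk> world <comma> this is a test <period>"))
# prints: hello world, this is a test.
```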
### Access
All eight of the datasets in ESC are accessible and licensing is freely available. Three of the ESC datasets have specific terms of usage that must be agreed to before using the data. To do so, fill in the access forms on the specific datasets' pages:
* Common Voice: https://huggingface.co/datasets/mozilla-foundation/common_voice_9_0
* GigaSpeech: https://huggingface.co/datasets/speechcolab/gigaspeech
* SPGISpeech: https://huggingface.co/datasets/kensho/spgispeech
## LibriSpeech
The LibriSpeech corpus is a standard large-scale corpus for assessing ASR systems. It consists of approximately 1,000 hours of narrated audiobooks from the [LibriVox](https://librivox.org) project. It is licensed under CC-BY-4.0.
Example Usage:
```python
librispeech = load_dataset("esc-benchmark/esc-datasets", "librispeech")
```
Train/validation splits:
- `train` (combination of `train.clean.100`, `train.clean.360` and `train.other.500`)
- `validation.clean`
- `validation.other`
Test splits:
- `test.clean`
- `test.other`
Also available are subsets of the train split, which can be accessed by setting the `subconfig` argument:
```python
librispeech = load_dataset("esc-benchmark/esc-datasets", "librispeech", subconfig="clean.100")
```
- `clean.100`: 100 hours of training data from the 'clean' subset
- `clean.360`: 360 hours of training data from the 'clean' subset
- `other.500`: 500 hours of training data from the 'other' subset
## Common Voice
Common Voice is a series of crowd-sourced open-licensed speech datasets where speakers record text from Wikipedia in various languages. The English subset contains approximately 1,400 hours of audio data from speakers of various nationalities and accents, recorded under different conditions. It is licensed under CC0-1.0.
Example usage:
```python
common_voice = load_dataset("esc-benchmark/esc-datasets", "common_voice", use_auth_token=True)
```
Training/validation splits:
- `train`
- `validation`
Test splits:
- `test`
## VoxPopuli
VoxPopuli is a large-scale multilingual speech corpus consisting of political data sourced from 2009-2020 European Parliament event recordings. The English subset contains approximately 550 hours of speech, largely from non-native English speakers. It is licensed under CC0.
Example usage:
```python
voxpopuli = load_dataset("esc-benchmark/esc-datasets", "voxpopuli")
```
Training/validation splits:
- `train`
- `validation`
Test splits:
- `test`
## TED-LIUM
TED-LIUM consists of English-language TED Talk conference videos covering a range of different cultural, political, and academic topics. It contains approximately 450 hours of transcribed speech data. It is licensed under CC-BY-NC-ND 3.0.
Example usage:
```python
tedlium = load_dataset("esc-benchmark/esc-datasets", "tedlium")
```
Training/validation splits:
- `train`
- `validation`
Test splits:
- `test`
## GigaSpeech
GigaSpeech is a multi-domain English speech recognition corpus created from audiobooks, podcasts and YouTube. We provide the large train set (2,500 hours) and the standard validation and test splits. It is licensed under apache-2.0.
Example usage:
```python
gigaspeech = load_dataset("esc-benchmark/esc-datasets", "gigaspeech", use_auth_token=True)
```
Training/validation splits:
- `train` (`l` subset of training data (2,500 h))
- `validation`
Test splits:
- `test`
Also available are subsets of the train split, which can be accessed by setting the `subconfig` argument:
```python
gigaspeech = load_dataset("esc-benchmark/esc-datasets", "gigaspeech", subconfig="xs", use_auth_token=True)
```
- `xs`: extra-small subset of training data (10 h)
- `s`: small subset of training data (250 h)
- `m`: medium subset of training data (1,000 h)
- `xl`: extra-large subset of training data (10,000 h)
## SPGISpeech
SPGISpeech consists of company earnings calls that have been manually transcribed by S&P Global, Inc. according to a professional style guide. We provide the large train set (5,000 hours) and the standard validation and test splits. It is licensed under a Kensho user agreement.
Loading the dataset requires authorization.
Example usage:
```python
spgispeech = load_dataset("esc-benchmark/esc-datasets", "spgispeech", use_auth_token=True)
```
Training/validation splits:
- `train` (`l` subset of training data (~5,000 h))
- `validation`
Test splits:
- `test`
Also available are subsets of the train split, which can be accessed by setting the `subconfig` argument:
```python
spgispeech = load_dataset("esc-benchmark/esc-datasets", "spgispeech", subconfig="s", use_auth_token=True)
```
- `s`: small subset of training data (~200 h)
- `m`: medium subset of training data (~1,000 h)
## Earnings-22
Earnings-22 is a 119-hour corpus of English-language earnings calls collected from global companies, with speakers of many different nationalities and accents. It is licensed under CC-BY-SA-4.0.
Example usage:
```python
earnings22 = load_dataset("esc-benchmark/esc-datasets", "earnings22")
```
Training/validation splits:
- `train`
- `validation`
Test splits:
- `test`
## AMI
The AMI Meeting Corpus consists of 100 hours of meeting recordings from multiple recording devices synced to a common timeline. It is licensed under CC-BY-4.0.
Example usage:
```python
ami = load_dataset("esc-benchmark/esc-datasets", "ami")
```
Training/validation splits:
- `train`
- `validation`
Test splits:
- `test`
|
ibm/argument_quality_ranking_30k | ---
license: cc-by-3.0
task_categories:
- text-classification
language:
- en
pretty_name: Argument-Quality-Ranking-30k
size_categories:
- 10K<n<100K
configs:
- config_name: argument_quality_ranking
data_files:
- split: train
path: "train.csv"
- split: validation
path: "dev.csv"
- split: test
path: "test.csv"
- config_name: argument_topic
data_files:
- split: train
path: "train_topic.csv"
- split: validation
path: "dev_topic.csv"
- split: test
path: "test_topic.csv"
---
# Dataset Card for Argument-Quality-Ranking-30k Dataset
## Table of Contents
- [Dataset Summary](#dataset-summary)
- [Argument Quality Ranking](#argument-quality-ranking)
- [Argument Topic](#argument-topic)
- [Dataset Collection](#dataset-collection)
- [Argument Collection](#argument-collection)
- [Quality and Stance Labeling](#quality-and-stance-labeling)
- [Dataset Structure](#dataset-structure)
- [Quality Labels](#quality-labels)
- [Stance Labels](#stance-labels)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Summary
### Argument Quality Ranking
The dataset contains 30,497 crowd-sourced arguments for 71 debatable topics labeled for quality and stance, split into train, validation and test sets.
The dataset was originally published as part of our paper: [A Large-scale Dataset for Argument Quality Ranking: Construction and Analysis](https://arxiv.org/abs/1911.11408).
### Argument Topic
This subset contains 9,487 of the arguments only with their topics with a different train-validation-test split. Usage of this subset TBA.
## Dataset Collection
### Argument Collection
For the purpose of collecting arguments for this dataset we conducted a crowd annotation task. We selected 71 common controversial topics for which arguments were collected (e.g., We should abolish capital punishment).
Annotators were presented with a single topic each time, and asked to contribute one supporting and one contesting argument for it, requiring arguments to be written using original language. To motivate high-quality contributions, contributors were informed they will receive extra payment for high quality arguments, as determined by the subsequent argument quality labeling task.
It was explained that an argument will be considered as a high quality one, if a person preparing a speech on the topic will be likely to use this argument as is in her speech.
We placed a limit on argument length - a minimum of 35 characters and a maximum of 210 characters. In total, we collected 30,497 arguments from 280 contributors, each contributing no more than 6 arguments per topic.
### Quality and Stance Labeling
Annotators were presented with a binary question per argument, asking if they would recommend a friend to use that argument as is in a speech supporting/contesting the topic, regardless of personal opinion.
In addition, annotators were asked to mark the stance of the argument towards the topic (pro or con).
10 annotators labeled each instance.
## Dataset Structure
Each instance contains a string argument, a string topic, and quality and stance scores:
* WA - the quality label according to the weighted-average scoring function
* MACE-P - the quality label according to the MACE-P scoring function
* stance_WA - the stance label according to the weighted-average scoring function
* stance_WA_conf - the confidence in the stance label according to the weighted-average scoring function
### Quality Labels
For an explanation of the quality labels presented in columns WA and MACE-P, please see section 4 in the paper.
### Stance Labels
There were three possible annotations for the stance task: 1 (pro), -1 (con) and 0 (neutral). The stance_WA_conf column refers to the weighted-average score of the winning label. The stance_WA column refers to the winning stance label itself.
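As an illustration of how a winning stance label and its confidence could be derived from the 10 annotations, the following sketch uses a weighted vote (the uniform weights here are a simplifying assumption; the paper's actual annotator-reliability weighting is not reproduced):

```python
from collections import defaultdict

def stance_wa(annotations, weights=None):
    """Pick the winning stance (1=pro, -1=con, 0=neutral) by weighted vote.

    Returns (label, confidence), where confidence is the winning label's
    share of the total weight. Illustrative sketch only -- the dataset's
    annotator-reliability weights are not reproduced here.
    """
    if weights is None:
        weights = [1.0] * len(annotations)  # assumption: uniform weights
    totals = defaultdict(float)
    for label, w in zip(annotations, weights):
        totals[label] += w
    label, score = max(totals.items(), key=lambda kv: kv[1])
    return label, score / sum(weights)

label, conf = stance_wa([1, 1, 1, -1, 0, 1, 1, -1, 1, 1])
print(label, conf)  # 1 0.7
```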
## Licensing Information
The datasets are released under the following licensing and copyright terms:
* (c) Copyright [Wikipedia](https://en.wikipedia.org/wiki/Wikipedia:Copyrights#Reusers.27_rights_and_obligations)
* (c) Copyright IBM 2014. Released under [CC-BY-SA 3.0](http://creativecommons.org/licenses/by-sa/3.0/)
## Citation Information
```
@article{DBLP:journals/corr/abs-1911-11408,
author = {Shai Gretz and
Roni Friedman and
Edo Cohen{-}Karlik and
Assaf Toledo and
Dan Lahav and
Ranit Aharonov and
Noam Slonim},
title = {A Large-scale Dataset for Argument Quality Ranking: Construction and
Analysis},
journal = {CoRR},
volume = {abs/1911.11408},
year = {2019},
url = {http://arxiv.org/abs/1911.11408},
eprinttype = {arXiv},
eprint = {1911.11408},
timestamp = {Tue, 03 Dec 2019 20:41:07 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1911-11408.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` |
open-llm-leaderboard/details_Steelskull__Lumosia-v2-MoE-4x10.7 | ---
pretty_name: Evaluation run of Steelskull/Lumosia-v2-MoE-4x10.7
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Steelskull/Lumosia-v2-MoE-4x10.7](https://huggingface.co/Steelskull/Lumosia-v2-MoE-4x10.7)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Steelskull__Lumosia-v2-MoE-4x10.7\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T08:19:21.300026](https://huggingface.co/datasets/open-llm-leaderboard/details_Steelskull__Lumosia-v2-MoE-4x10.7/blob/main/results_2024-02-02T08-19-21.300026.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6680685275478645,\n\
\ \"acc_stderr\": 0.0315547578178304,\n \"acc_norm\": 0.6687810115447916,\n\
\ \"acc_norm_stderr\": 0.032201365533529785,\n \"mc1\": 0.5324357405140759,\n\
\ \"mc1_stderr\": 0.017466632149577617,\n \"mc2\": 0.6847502236527627,\n\
\ \"mc2_stderr\": 0.015252351834031837\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6808873720136519,\n \"acc_stderr\": 0.013621696119173307,\n\
\ \"acc_norm\": 0.7039249146757679,\n \"acc_norm_stderr\": 0.013340916085246252\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7024497112129058,\n\
\ \"acc_stderr\": 0.004562462665505232,\n \"acc_norm\": 0.8787094204341764,\n\
\ \"acc_norm_stderr\": 0.003257974593789941\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6170212765957447,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.04013124195424386,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.04013124195424386\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130733,\n \"\
acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130733\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8129032258064516,\n \"acc_stderr\": 0.022185710092252255,\n \"\
acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.022185710092252255\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033467,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033467\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.02407869658063547,\n \
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.02407869658063547\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.02907937453948001,\n \
\ \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.02907937453948001\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009246,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009246\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.03381200005643526,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.03381200005643526\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025046,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025046\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \
\ \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596915,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596915\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077812,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077812\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.013890862162876168,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.013890862162876168\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n\
\ \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n\
\ \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n\
\ \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.024404394928087866,\n\
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.024404394928087866\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984824,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984824\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02313237623454334,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02313237623454334\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49282920469361147,\n\
\ \"acc_stderr\": 0.012768922739553308,\n \"acc_norm\": 0.49282920469361147,\n\
\ \"acc_norm_stderr\": 0.012768922739553308\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789527,\n\
\ \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789527\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6895424836601307,\n \"acc_stderr\": 0.018718067052623216,\n \
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.018718067052623216\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744546,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744546\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5903614457831325,\n\
\ \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.5903614457831325,\n\
\ \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5324357405140759,\n\
\ \"mc1_stderr\": 0.017466632149577617,\n \"mc2\": 0.6847502236527627,\n\
\ \"mc2_stderr\": 0.015252351834031837\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8421468034727704,\n \"acc_stderr\": 0.010247165248719763\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6512509476876421,\n \
\ \"acc_stderr\": 0.013127227055035863\n }\n}\n```"
repo_url: https://huggingface.co/Steelskull/Lumosia-v2-MoE-4x10.7
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|arc:challenge|25_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|gsm8k|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hellaswag|10_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T08-19-21.300026.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T08-19-21.300026.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- '**/details_harness|winogrande|5_2024-02-02T08-19-21.300026.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T08-19-21.300026.parquet'
- config_name: results
data_files:
- split: 2024_02_02T08_19_21.300026
path:
- results_2024-02-02T08-19-21.300026.parquet
- split: latest
path:
- results_2024-02-02T08-19-21.300026.parquet
---
# Dataset Card for Evaluation run of Steelskull/Lumosia-v2-MoE-4x10.7
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Steelskull/Lumosia-v2-MoE-4x10.7](https://huggingface.co/Steelskull/Lumosia-v2-MoE-4x10.7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Steelskull__Lumosia-v2-MoE-4x10.7",
	"harness_winogrande_5",
	split="latest")
```
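The configuration names above follow a simple convention derived from the harness task identifiers (e.g. `harness|winogrande|5` becomes `harness_winogrande_5`). A minimal helper illustrating that mapping — an illustration for convenience, not part of any official tooling:

```python
def task_to_config(task: str) -> str:
    """Map a harness task id such as 'hendrycksTest-world_religions|5'
    (task name, then number of few-shot examples after the '|')
    to the corresponding dataset configuration name."""
    name, n_shots = task.split("|")
    # '-' and ':' in task names become '_' in config names.
    cleaned = name.replace("-", "_").replace(":", "_")
    return f"harness_{cleaned}_{n_shots}"

print(task_to_config("truthfulqa:mc|0"))  # harness_truthfulqa_mc_0
```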
## Latest results
These are the [latest results from run 2024-02-02T08:19:21.300026](https://huggingface.co/datasets/open-llm-leaderboard/details_Steelskull__Lumosia-v2-MoE-4x10.7/blob/main/results_2024-02-02T08-19-21.300026.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6680685275478645,
"acc_stderr": 0.0315547578178304,
"acc_norm": 0.6687810115447916,
"acc_norm_stderr": 0.032201365533529785,
"mc1": 0.5324357405140759,
"mc1_stderr": 0.017466632149577617,
"mc2": 0.6847502236527627,
"mc2_stderr": 0.015252351834031837
},
"harness|arc:challenge|25": {
"acc": 0.6808873720136519,
"acc_stderr": 0.013621696119173307,
"acc_norm": 0.7039249146757679,
"acc_norm_stderr": 0.013340916085246252
},
"harness|hellaswag|10": {
"acc": 0.7024497112129058,
"acc_stderr": 0.004562462665505232,
"acc_norm": 0.8787094204341764,
"acc_norm_stderr": 0.003257974593789941
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.04013124195424386,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.04013124195424386
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47883597883597884,
"acc_stderr": 0.025728230952130733,
"acc_norm": 0.47883597883597884,
"acc_norm_stderr": 0.025728230952130733
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252255,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252255
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033467,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.02407869658063547,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.02407869658063547
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7226890756302521,
"acc_stderr": 0.02907937453948001,
"acc_norm": 0.7226890756302521,
"acc_norm_stderr": 0.02907937453948001
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009246,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009246
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.03381200005643526,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.03381200005643526
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025046,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596915,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596915
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077812,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077812
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876168,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876168
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.024404394928087866,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.024404394928087866
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984824,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984824
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02313237623454334,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02313237623454334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49282920469361147,
"acc_stderr": 0.012768922739553308,
"acc_norm": 0.49282920469361147,
"acc_norm_stderr": 0.012768922739553308
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7463235294117647,
"acc_stderr": 0.026431329870789527,
"acc_norm": 0.7463235294117647,
"acc_norm_stderr": 0.026431329870789527
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.018718067052623216,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.018718067052623216
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5903614457831325,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.5903614457831325,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5324357405140759,
"mc1_stderr": 0.017466632149577617,
"mc2": 0.6847502236527627,
"mc2_stderr": 0.015252351834031837
},
"harness|winogrande|5": {
"acc": 0.8421468034727704,
"acc_stderr": 0.010247165248719763
},
"harness|gsm8k|5": {
"acc": 0.6512509476876421,
"acc_stderr": 0.013127227055035863
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
louisbrulenaudet/code-pensions-militaires-invalidite-victimes-guerre | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code des pensions militaires d'invalidité et des victimes de guerre
source_datasets:
- original
pretty_name: Code des pensions militaires d'invalidité et des victimes de guerre
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code des pensions militaires d'invalidité et des victimes de guerre, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging

import datasets
from tqdm.notebook import tqdm


def dataset_loader(
    name: str,
    streaming: bool = True
) -> datasets.Dataset:
    """
    Helper function to load a single dataset in parallel.

    Parameters
    ----------
    name : str
        Name of the dataset to be loaded.

    streaming : bool, optional
        Determines if datasets are streamed. Default is True.

    Returns
    -------
    dataset : datasets.Dataset
        Loaded dataset object, or None if an error occurred during loading.
    """
    try:
        return datasets.load_dataset(
            name,
            split="train",
            streaming=streaming
        )

    except Exception as exc:
        logging.error(f"Error loading dataset {name}: {exc}")
        return None


def load_datasets(
    req: list,
    streaming: bool = True
) -> list:
    """
    Downloads datasets specified in a list and creates a list of loaded datasets.

    Parameters
    ----------
    req : list
        A list containing the names of datasets to be downloaded.

    streaming : bool, optional
        Determines if datasets are streamed. Default is True.

    Returns
    -------
    datasets_list : list
        A list containing loaded datasets as per the requested names provided in 'req'.

    Examples
    --------
    >>> datasets_list = load_datasets(["dataset1", "dataset2"], streaming=False)
    """
    datasets_list = []

    with concurrent.futures.ThreadPoolExecutor() as executor:
        future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}

        for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
            name = future_to_dataset[future]

            try:
                dataset = future.result()

                if dataset:
                    datasets_list.append(dataset)

            except Exception as exc:
                logging.error(f"Error processing dataset {name}: {exc}")

    return datasets_list


req = [
    "louisbrulenaudet/code-artisanat",
    "louisbrulenaudet/code-action-sociale-familles",
    # ...
]

datasets_list = load_datasets(
    req=req,
    streaming=True
)

dataset = datasets.concatenate_datasets(
    datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
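As a minimal sketch of what one element looks like (the field values below are invented placeholders, not actual dataset content; only the field names follow the schema above), a record can be modeled as a plain Python dictionary:

```python
# Hypothetical record: the values are illustrative placeholders, not real
# dataset content; only the field names follow the documented schema.
record = {
    "instruction": "Quel est le texte intégral de l'article ?",
    "input": "",
    "output": "Texte intégral de l'article…",
    "start": "2017-01-01",
    "expiration": "2999-01-01",
    "num": "L1",
}

# Every element exposes exactly these six string fields.
assert set(record) == {"instruction", "input", "output", "start", "expiration", "num"}
assert all(isinstance(value, str) for value in record.values())
```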
We used the following list of instructions for generating the dataset:
```python
instructions = [
    "Compose l'intégralité de l'article sous forme écrite.",
    "Écris la totalité du contenu de l'article.",
    "Formule la totalité du texte présent dans l'article.",
    "Produis l'intégralité de l'article en écriture.",
    "Développe l'article dans son ensemble par écrit.",
    "Génère l'ensemble du texte contenu dans l'article.",
    "Formule le contenu intégral de l'article en entier.",
    "Rédige la totalité du texte de l'article en entier.",
    "Compose l'intégralité du contenu textuel de l'article.",
    "Rédige l'ensemble du texte qui constitue l'article.",
    "Formule l'article entier dans son contenu écrit.",
    "Composez l'intégralité de l'article sous forme écrite.",
    "Écrivez la totalité du contenu de l'article.",
    "Formulez la totalité du texte présent dans l'article.",
    "Développez l'article dans son ensemble par écrit.",
    "Générez l'ensemble du texte contenu dans l'article.",
    "Formulez le contenu intégral de l'article en entier.",
    "Rédigez la totalité du texte de l'article en entier.",
    "Composez l'intégralité du contenu textuel de l'article.",
    "Écrivez l'article dans son intégralité en termes de texte.",
    "Rédigez l'ensemble du texte qui constitue l'article.",
    "Formulez l'article entier dans son contenu écrit.",
    "Composer l'intégralité de l'article sous forme écrite.",
    "Écrire la totalité du contenu de l'article.",
    "Formuler la totalité du texte présent dans l'article.",
    "Produire l'intégralité de l'article en écriture.",
    "Développer l'article dans son ensemble par écrit.",
    "Générer l'ensemble du texte contenu dans l'article.",
    "Formuler le contenu intégral de l'article en entier.",
    "Rédiger la totalité du texte de l'article en entier.",
    "Composer l'intégralité du contenu textuel de l'article.",
    "Rédiger l'ensemble du texte qui constitue l'article.",
    "Formuler l'article entier dans son contenu écrit.",
    "Quelles sont les dispositions de l'article ?",
    "Quelles dispositions sont incluses dans l'article ?",
    "Quelles sont les dispositions énoncées dans l'article ?",
    "Quel est le texte intégral de l'article ?",
    "Quelle est la lettre de l'article ?"
]
```
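Under the assumption that each article arrives as a dictionary of its text and metadata (the `article` fields and the `build_record` helper below are illustrative, not the actual generation script), pairing an article with a randomly drawn instruction can be sketched as:

```python
import random


def build_record(article, instructions, rng=None):
    """Pair one article with a randomly sampled instruction.

    `article` is assumed to hold the text and metadata of a single legal
    article; the field names used here are illustrative.
    """
    rng = rng or random.Random()
    return {
        "instruction": rng.choice(instructions),
        "input": article["title"],
        "output": article["text"],
        "start": article["start"],
        "expiration": article["expiration"],
        "num": article["num"],
    }


# Hypothetical article used only to exercise the helper.
article = {
    "title": "Code des pensions militaires d'invalidité et des victimes de guerre, art. L1",
    "text": "Texte de l'article…",
    "start": "2017-01-01",
    "expiration": "2999-01-01",
    "num": "L1",
}

record = build_record(
    article,
    instructions=["Quel est le texte intégral de l'article ?"],
    rng=random.Random(0),
)
```

Sampling the instruction per article keeps the instruction distribution roughly uniform across the corpus while leaving the article text itself untouched.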
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
Ga88/Clovis | ---
license: openrail
---
|
open-llm-leaderboard/details_mlabonne__NeuralMonarch-7B | ---
pretty_name: Evaluation run of mlabonne/NeuralMonarch-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mlabonne/NeuralMonarch-7B](https://huggingface.co/mlabonne/NeuralMonarch-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__NeuralMonarch-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-14T10:44:03.358725](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralMonarch-7B/blob/main/results_2024-02-14T10-44-03.358725.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6501327325533349,\n\
\ \"acc_stderr\": 0.032222664885814316,\n \"acc_norm\": 0.6497540751488936,\n\
\ \"acc_norm_stderr\": 0.03289485359002978,\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7779478264166126,\n\
\ \"mc2_stderr\": 0.013764993545897771\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7047781569965871,\n \"acc_stderr\": 0.01332975029338232,\n\
\ \"acc_norm\": 0.7320819112627986,\n \"acc_norm_stderr\": 0.012942030195136444\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7168890659231228,\n\
\ \"acc_stderr\": 0.0044958914405194205,\n \"acc_norm\": 0.8908583947420833,\n\
\ \"acc_norm_stderr\": 0.0031117953207879436\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"\
acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.423841059602649,\n \"acc_stderr\": 0.04034846678603397,\n \"acc_norm\"\
: 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603397\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n\
\ \"acc_stderr\": 0.016060056268530336,\n \"acc_norm\": 0.8311926605504587,\n\
\ \"acc_norm_stderr\": 0.016060056268530336\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n\
\ \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464076,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41564245810055866,\n\
\ \"acc_stderr\": 0.016482782187500666,\n \"acc_norm\": 0.41564245810055866,\n\
\ \"acc_norm_stderr\": 0.016482782187500666\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4784876140808344,\n\
\ \"acc_stderr\": 0.012758410941038911,\n \"acc_norm\": 0.4784876140808344,\n\
\ \"acc_norm_stderr\": 0.012758410941038911\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039655,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039655\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7779478264166126,\n\
\ \"mc2_stderr\": 0.013764993545897771\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.846093133385951,\n \"acc_stderr\": 0.010141944523750038\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6777862016679302,\n \
\ \"acc_stderr\": 0.012872435481188776\n }\n}\n```"
repo_url: https://huggingface.co/mlabonne/NeuralMonarch-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|arc:challenge|25_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|gsm8k|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hellaswag|10_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T10-44-03.358725.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T10-44-03.358725.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- '**/details_harness|winogrande|5_2024-02-14T10-44-03.358725.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-14T10-44-03.358725.parquet'
- config_name: results
data_files:
- split: 2024_02_14T10_44_03.358725
path:
- results_2024-02-14T10-44-03.358725.parquet
- split: latest
path:
- results_2024-02-14T10-44-03.358725.parquet
---
# Dataset Card for Evaluation run of mlabonne/NeuralMonarch-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/NeuralMonarch-7B](https://huggingface.co/mlabonne/NeuralMonarch-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__NeuralMonarch-7B",
"harness_winogrande_5",
	split="latest")
```
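The per-task scores in the "results" configuration can be aggregated offline in the same spirit as the leaderboard's averages. A minimal sketch, using illustrative values copied from the aggregated results below (the key names follow the `harness|task|n-shot` convention used throughout this card):

```python
# Macro-average the "acc" metric across a few tasks, the way the
# leaderboard summarizes MMLU-style results. Values are illustrative,
# taken from the aggregated results shown below.
per_task_acc = {
    "harness|arc:challenge|25": 0.7047781569965871,
    "harness|hellaswag|10": 0.7168890659231228,
    "harness|hendrycksTest-abstract_algebra|5": 0.33,
}

macro_avg = sum(per_task_acc.values()) / len(per_task_acc)
print(f"macro-average acc: {macro_avg:.4f}")
```

In practice you would iterate over every `harness_*` configuration rather than a hand-picked subset; the "all" entry in the results JSON below already stores this aggregate for the full run.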
## Latest results
These are the [latest results from run 2024-02-14T10:44:03.358725](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralMonarch-7B/blob/main/results_2024-02-14T10-44-03.358725.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6501327325533349,
"acc_stderr": 0.032222664885814316,
"acc_norm": 0.6497540751488936,
"acc_norm_stderr": 0.03289485359002978,
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7779478264166126,
"mc2_stderr": 0.013764993545897771
},
"harness|arc:challenge|25": {
"acc": 0.7047781569965871,
"acc_stderr": 0.01332975029338232,
"acc_norm": 0.7320819112627986,
"acc_norm_stderr": 0.012942030195136444
},
"harness|hellaswag|10": {
"acc": 0.7168890659231228,
"acc_stderr": 0.0044958914405194205,
"acc_norm": 0.8908583947420833,
"acc_norm_stderr": 0.0031117953207879436
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.04034846678603397,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.04034846678603397
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530336,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530336
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464076,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464076
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41564245810055866,
"acc_stderr": 0.016482782187500666,
"acc_norm": 0.41564245810055866,
"acc_norm_stderr": 0.016482782187500666
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4784876140808344,
"acc_stderr": 0.012758410941038911,
"acc_norm": 0.4784876140808344,
"acc_norm_stderr": 0.012758410941038911
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.02850145286039655,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.02850145286039655
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7779478264166126,
"mc2_stderr": 0.013764993545897771
},
"harness|winogrande|5": {
"acc": 0.846093133385951,
"acc_stderr": 0.010141944523750038
},
"harness|gsm8k|5": {
"acc": 0.6777862016679302,
"acc_stderr": 0.012872435481188776
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zZWipeoutZz/skeleton_slime | ---
license: creativeml-openrail-m
---
<h4> Disclosure </h4>
<p> While it's not perfect, I hope you are able to create some nice pieces with it. I am working on improvements for the next embedding, coming soon; if you have any suggestions or issues, please let me know. </p>
<h4> Usage </h4>
To use this embedding, download the file and put it into the "\stable-diffusion-webui\embeddings" folder.
To use it in a prompt, add
<em style="font-weight:600">art by skeleton slime </em>
Add <b>[ ]</b> around it to reduce its weight.
<h4> Included Files </h4>
<ul>
<li>6500 steps <em>Usage: art by skeleton slime- 6500</em></li>
<li>10,000 steps <em>Usage: art by skeleton slime-10000</em> </li>
<li>15,000 steps <em>Usage: art by skeleton slime</em></li>
</ul>
cheers<br>
Wipeout
<h4> Example Pictures </h4>
<table>
<tbody>
<tr>
<td><img height="100%/" width="100%" src="https://i.imgur.com/ATm5o4H.png"></td>
<td><img height="100%/" width="100%" src="https://i.imgur.com/DpdwiyC.png"></td>
<td><img height="100%/" width="100%" src="https://i.imgur.com/qwGmnel.png"></td>
</tr>
</tbody>
</table>
<h4> prompt comparison </h4>
<a href="https://i.imgur.com/SF3kfd4.jpg" target="_blank"><img height="100%" width="100%" src="https://i.imgur.com/SF3kfd4.jpg"></a>
<h4> Licence </h4>
<p><span>This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage. The CreativeML OpenRAIL License specifies:</span> </p>
<ol>
<li>You can't use the embedding to deliberately produce or share illegal or harmful outputs or content </li>
<li>The author claims no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license</li>
<li>You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
<a rel="noopener nofollow" href="https://huggingface.co/spaces/CompVis/stable-diffusion-license">Please read the full license here</a></li>
</ol> |
one-sec-cv12/chunk_172 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 20080179072.0
num_examples: 209064
download_size: 17204466804
dataset_size: 20080179072.0
---
# Dataset Card for "chunk_172"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/tar_21_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tar_21/TAR-21/TAR-21 (Girls' Frontline)
This is the dataset of tar_21/TAR-21/TAR-21 (Girls' Frontline), containing 165 images and their tags.
The core tags of this character are `long_hair, blue_hair, breasts, yellow_eyes, large_breasts, bangs, bow, very_long_hair, animal_ears, rabbit_ears, fake_animal_ears, hair_between_eyes, headgear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 165 | 276.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tar_21_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 165 | 136.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tar_21_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 417 | 289.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tar_21_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 165 | 233.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tar_21_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 417 | 447.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tar_21_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tar_21_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
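The IMG+TXT packages listed above store one tag file per image instead of waifuc metadata. After extracting one of those zips (e.g. with `hf_hub_download` and `zipfile`, as in the snippet above), the pairs can be read back without waifuc. A minimal sketch, assuming the packages follow the usual convention of a same-named `.txt` file holding comma-separated tags for each image (`load_img_txt_pairs` is an illustrative helper, not part of any library):

```python
import os
import tempfile

def load_img_txt_pairs(dataset_dir):
    """Pair each image in dataset_dir with its same-named .txt tag file
    and parse the comma-separated tags."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if not os.path.exists(txt_path):
            continue  # skip images that ship without a tag file
        with open(txt_path, encoding='utf-8') as f:
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        pairs.append((name, tags))
    return pairs

# tiny synthetic demo in place of a real extracted package
demo_dir = tempfile.mkdtemp()
open(os.path.join(demo_dir, '1.png'), 'wb').close()
with open(os.path.join(demo_dir, '1.txt'), 'w', encoding='utf-8') as f:
    f.write('1girl, solo, looking_at_viewer')
print(load_img_txt_pairs(demo_dir))
# → [('1.png', ['1girl', 'solo', 'looking_at_viewer'])]
```

Point `demo_dir` at the directory you extracted a real package into to get the same `(filename, tags)` pairs for training or filtering.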
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, assault_rifle, bullpup, looking_at_viewer, robot_ears, solo, star_of_david, white_leotard, black_pantyhose, full_body, high_heels, highleg_leotard, jacket, smile, thigh_strap, white_footwear, black_gloves, closed_mouth, fingerless_gloves, holding_gun, shoes, simple_background, white_background, brown_pantyhose, standing |
| 1 | 35 |  |  |  |  |  | 1girl, detached_collar, looking_at_viewer, official_alternate_costume, playboy_bunny, solo, white_leotard, strapless_leotard, black_bowtie, cleavage, black_pantyhose, wrist_cuffs, rabbit_tail, side-tie_leotard, white_background, bare_shoulders, simple_background, smile, star_of_david, covered_navel, fake_tail, blush, holding_tray |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | assault_rifle | bullpup | looking_at_viewer | robot_ears | solo | star_of_david | white_leotard | black_pantyhose | full_body | high_heels | highleg_leotard | jacket | smile | thigh_strap | white_footwear | black_gloves | closed_mouth | fingerless_gloves | holding_gun | shoes | simple_background | white_background | brown_pantyhose | standing | detached_collar | official_alternate_costume | playboy_bunny | strapless_leotard | black_bowtie | cleavage | wrist_cuffs | rabbit_tail | side-tie_leotard | bare_shoulders | covered_navel | fake_tail | blush | holding_tray |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:----------|:--------------------|:-------------|:-------|:----------------|:----------------|:------------------|:------------|:-------------|:------------------|:---------|:--------|:--------------|:-----------------|:---------------|:---------------|:--------------------|:--------------|:--------|:--------------------|:-------------------|:------------------|:-----------|:------------------|:-----------------------------|:----------------|:--------------------|:---------------|:-----------|:--------------|:--------------|:-------------------|:-----------------|:----------------|:------------|:--------|:---------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 35 |  |  |  |  |  | X | | | X | | X | X | X | X | | | | | X | | | | | | | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_castorini__rank_vicuna_7b_v1_fp16 | ---
pretty_name: Evaluation run of castorini/rank_vicuna_7b_v1_fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [castorini/rank_vicuna_7b_v1_fp16](https://huggingface.co/castorini/rank_vicuna_7b_v1_fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_castorini__rank_vicuna_7b_v1_fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T14:11:17.677021](https://huggingface.co/datasets/open-llm-leaderboard/details_castorini__rank_vicuna_7b_v1_fp16/blob/main/results_2024-01-04T14-11-17.677021.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.43807693081233745,\n\
\ \"acc_stderr\": 0.034327867059302436,\n \"acc_norm\": 0.4451290963260708,\n\
\ \"acc_norm_stderr\": 0.03526514680417224,\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.4512725152724823,\n\
\ \"mc2_stderr\": 0.015672269561043818\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4112627986348123,\n \"acc_stderr\": 0.014379441068522077,\n\
\ \"acc_norm\": 0.4462457337883959,\n \"acc_norm_stderr\": 0.014526705548539982\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4856602270464051,\n\
\ \"acc_stderr\": 0.0049877289008975955,\n \"acc_norm\": 0.6567416849233221,\n\
\ \"acc_norm_stderr\": 0.004738264944737159\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.04026097083296558,\n\
\ \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.04026097083296558\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5018867924528302,\n \"acc_stderr\": 0.030772653642075664,\n\
\ \"acc_norm\": 0.5018867924528302,\n \"acc_norm_stderr\": 0.030772653642075664\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.34104046242774566,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.34104046242774566,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237656,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237656\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.03141082197596241,\n\
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.03141082197596241\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707546,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707546\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.02351729433596328,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02351729433596328\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.49032258064516127,\n\
\ \"acc_stderr\": 0.02843867799890954,\n \"acc_norm\": 0.49032258064516127,\n\
\ \"acc_norm_stderr\": 0.02843867799890954\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998574,\n\
\ \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5212121212121212,\n \"acc_stderr\": 0.03900828913737301,\n\
\ \"acc_norm\": 0.5212121212121212,\n \"acc_norm_stderr\": 0.03900828913737301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5757575757575758,\n \"acc_stderr\": 0.03521224908841585,\n \"\
acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03521224908841585\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5647668393782384,\n \"acc_stderr\": 0.035780381650085846,\n\
\ \"acc_norm\": 0.5647668393782384,\n \"acc_norm_stderr\": 0.035780381650085846\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4153846153846154,\n \"acc_stderr\": 0.024985354923102318,\n\
\ \"acc_norm\": 0.4153846153846154,\n \"acc_norm_stderr\": 0.024985354923102318\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22592592592592592,\n \"acc_stderr\": 0.02549753263960955,\n \
\ \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.02549753263960955\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n\
\ \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5651376146788991,\n \"acc_stderr\": 0.021254631465609283,\n \"\
acc_norm\": 0.5651376146788991,\n \"acc_norm_stderr\": 0.021254631465609283\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3611111111111111,\n \"acc_stderr\": 0.03275773486100999,\n \"\
acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.03275773486100999\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5294117647058824,\n \"acc_stderr\": 0.035032352963679944,\n \"\
acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.035032352963679944\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5274261603375527,\n \"acc_stderr\": 0.032498227183013026,\n \
\ \"acc_norm\": 0.5274261603375527,\n \"acc_norm_stderr\": 0.032498227183013026\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.515695067264574,\n\
\ \"acc_stderr\": 0.0335412657542081,\n \"acc_norm\": 0.515695067264574,\n\
\ \"acc_norm_stderr\": 0.0335412657542081\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.043841400240780176,\n\
\ \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.043841400240780176\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5785123966942148,\n \"acc_stderr\": 0.045077322787750874,\n \"\
acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.045077322787750874\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190192,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4662576687116564,\n \"acc_stderr\": 0.03919415545048408,\n\
\ \"acc_norm\": 0.4662576687116564,\n \"acc_norm_stderr\": 0.03919415545048408\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5683760683760684,\n\
\ \"acc_stderr\": 0.0324483553531149,\n \"acc_norm\": 0.5683760683760684,\n\
\ \"acc_norm_stderr\": 0.0324483553531149\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5862068965517241,\n\
\ \"acc_stderr\": 0.01761220408466376,\n \"acc_norm\": 0.5862068965517241,\n\
\ \"acc_norm_stderr\": 0.01761220408466376\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4797687861271676,\n \"acc_stderr\": 0.026897049996382868,\n\
\ \"acc_norm\": 0.4797687861271676,\n \"acc_norm_stderr\": 0.026897049996382868\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n\
\ \"acc_stderr\": 0.014756906483260666,\n \"acc_norm\": 0.264804469273743,\n\
\ \"acc_norm_stderr\": 0.014756906483260666\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4738562091503268,\n \"acc_stderr\": 0.028590752958852394,\n\
\ \"acc_norm\": 0.4738562091503268,\n \"acc_norm_stderr\": 0.028590752958852394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5048231511254019,\n\
\ \"acc_stderr\": 0.028396770444111298,\n \"acc_norm\": 0.5048231511254019,\n\
\ \"acc_norm_stderr\": 0.028396770444111298\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.027648477877413327,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.027648477877413327\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3191489361702128,\n \"acc_stderr\": 0.027807990141320186,\n \
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.027807990141320186\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3213820078226858,\n\
\ \"acc_stderr\": 0.011927581352265076,\n \"acc_norm\": 0.3213820078226858,\n\
\ \"acc_norm_stderr\": 0.011927581352265076\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3897058823529412,\n \"acc_stderr\": 0.029624663581159696,\n\
\ \"acc_norm\": 0.3897058823529412,\n \"acc_norm_stderr\": 0.029624663581159696\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4068627450980392,\n \"acc_stderr\": 0.01987380200506118,\n \
\ \"acc_norm\": 0.4068627450980392,\n \"acc_norm_stderr\": 0.01987380200506118\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5795918367346938,\n \"acc_stderr\": 0.03160106993449601,\n\
\ \"acc_norm\": 0.5795918367346938,\n \"acc_norm_stderr\": 0.03160106993449601\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n\
\ \"acc_stderr\": 0.034288678487786564,\n \"acc_norm\": 0.6218905472636815,\n\
\ \"acc_norm_stderr\": 0.034288678487786564\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6023391812865497,\n \"acc_stderr\": 0.03753638955761691,\n\
\ \"acc_norm\": 0.6023391812865497,\n \"acc_norm_stderr\": 0.03753638955761691\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.4512725152724823,\n\
\ \"mc2_stderr\": 0.015672269561043818\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6661404893449092,\n \"acc_stderr\": 0.013254029695143351\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/castorini/rank_vicuna_7b_v1_fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|arc:challenge|25_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|gsm8k|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hellaswag|10_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-11-17.677021.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T14-11-17.677021.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- '**/details_harness|winogrande|5_2024-01-04T14-11-17.677021.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T14-11-17.677021.parquet'
- config_name: results
data_files:
- split: 2024_01_04T14_11_17.677021
path:
- results_2024-01-04T14-11-17.677021.parquet
- split: latest
path:
- results_2024-01-04T14-11-17.677021.parquet
---
# Dataset Card for Evaluation run of castorini/rank_vicuna_7b_v1_fp16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [castorini/rank_vicuna_7b_v1_fp16](https://huggingface.co/castorini/rank_vicuna_7b_v1_fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_castorini__rank_vicuna_7b_v1_fp16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-04T14:11:17.677021](https://huggingface.co/datasets/open-llm-leaderboard/details_castorini__rank_vicuna_7b_v1_fp16/blob/main/results_2024-01-04T14-11-17.677021.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.43807693081233745,
"acc_stderr": 0.034327867059302436,
"acc_norm": 0.4451290963260708,
"acc_norm_stderr": 0.03526514680417224,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.4512725152724823,
"mc2_stderr": 0.015672269561043818
},
"harness|arc:challenge|25": {
"acc": 0.4112627986348123,
"acc_stderr": 0.014379441068522077,
"acc_norm": 0.4462457337883959,
"acc_norm_stderr": 0.014526705548539982
},
"harness|hellaswag|10": {
"acc": 0.4856602270464051,
"acc_stderr": 0.0049877289008975955,
"acc_norm": 0.6567416849233221,
"acc_norm_stderr": 0.004738264944737159
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.04026097083296558,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.04026097083296558
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5018867924528302,
"acc_stderr": 0.030772653642075664,
"acc_norm": 0.5018867924528302,
"acc_norm_stderr": 0.030772653642075664
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.34104046242774566,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.34104046242774566,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237656,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237656
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.03141082197596241,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.03141082197596241
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707546,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707546
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02351729433596328,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02351729433596328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.49032258064516127,
"acc_stderr": 0.02843867799890954,
"acc_norm": 0.49032258064516127,
"acc_norm_stderr": 0.02843867799890954
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998574,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5212121212121212,
"acc_stderr": 0.03900828913737301,
"acc_norm": 0.5212121212121212,
"acc_norm_stderr": 0.03900828913737301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03521224908841585,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03521224908841585
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5647668393782384,
"acc_stderr": 0.035780381650085846,
"acc_norm": 0.5647668393782384,
"acc_norm_stderr": 0.035780381650085846
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4153846153846154,
"acc_stderr": 0.024985354923102318,
"acc_norm": 0.4153846153846154,
"acc_norm_stderr": 0.024985354923102318
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.02549753263960955,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.02549753263960955
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.035118075718047245,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.035118075718047245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5651376146788991,
"acc_stderr": 0.021254631465609283,
"acc_norm": 0.5651376146788991,
"acc_norm_stderr": 0.021254631465609283
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.03275773486100999,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.03275773486100999
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.035032352963679944,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.035032352963679944
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5274261603375527,
"acc_stderr": 0.032498227183013026,
"acc_norm": 0.5274261603375527,
"acc_norm_stderr": 0.032498227183013026
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.515695067264574,
"acc_stderr": 0.0335412657542081,
"acc_norm": 0.515695067264574,
"acc_norm_stderr": 0.0335412657542081
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.045077322787750874,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.045077322787750874
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190192,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4662576687116564,
"acc_stderr": 0.03919415545048408,
"acc_norm": 0.4662576687116564,
"acc_norm_stderr": 0.03919415545048408
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5683760683760684,
"acc_stderr": 0.0324483553531149,
"acc_norm": 0.5683760683760684,
"acc_norm_stderr": 0.0324483553531149
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.01761220408466376,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.01761220408466376
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.026897049996382868,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.026897049996382868
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260666,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260666
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4738562091503268,
"acc_stderr": 0.028590752958852394,
"acc_norm": 0.4738562091503268,
"acc_norm_stderr": 0.028590752958852394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5048231511254019,
"acc_stderr": 0.028396770444111298,
"acc_norm": 0.5048231511254019,
"acc_norm_stderr": 0.028396770444111298
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.027648477877413327,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.027648477877413327
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.027807990141320186,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.027807990141320186
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3213820078226858,
"acc_stderr": 0.011927581352265076,
"acc_norm": 0.3213820078226858,
"acc_norm_stderr": 0.011927581352265076
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3897058823529412,
"acc_stderr": 0.029624663581159696,
"acc_norm": 0.3897058823529412,
"acc_norm_stderr": 0.029624663581159696
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4068627450980392,
"acc_stderr": 0.01987380200506118,
"acc_norm": 0.4068627450980392,
"acc_norm_stderr": 0.01987380200506118
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5795918367346938,
"acc_stderr": 0.03160106993449601,
"acc_norm": 0.5795918367346938,
"acc_norm_stderr": 0.03160106993449601
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6218905472636815,
"acc_stderr": 0.034288678487786564,
"acc_norm": 0.6218905472636815,
"acc_norm_stderr": 0.034288678487786564
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6023391812865497,
"acc_stderr": 0.03753638955761691,
"acc_norm": 0.6023391812865497,
"acc_norm_stderr": 0.03753638955761691
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.4512725152724823,
"mc2_stderr": 0.015672269561043818
},
"harness|winogrande|5": {
"acc": 0.6661404893449092,
"acc_stderr": 0.013254029695143351
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
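The `"all"` entry above is the macro-average of the per-task scores. A minimal sketch of that aggregation (the dictionary below is a small hypothetical excerpt of the results shown above, not the full results file):

```python
# Macro-average the per-task accuracies, as done for the "all" entry.
# `results` is a hypothetical excerpt of the JSON above, for illustration only.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.42771084337349397},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.6023391812865497},
}

accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))  # 0.515
```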
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_rte_referential_thing | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 207126
num_examples: 427
- name: train
num_bytes: 181765
num_examples: 362
download_size: 258625
dataset_size: 388891
---
# Dataset Card for "MULTI_VALUE_rte_referential_thing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lalitharavindan21/newdataset | ---
license: mit
---
|
GATE-engine/cubirds200 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: train
num_bytes: 657406642.5
num_examples: 8204
- name: validation
num_bytes: 136652007.625
num_examples: 1771
- name: test
num_bytes: 143386937.75
num_examples: 1770
download_size: 937336997
dataset_size: 937445587.875
---
# Dataset Card for "cubirds200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Shularp/generated_ar_en_th_datasets | ---
dataset_info:
features:
- name: ar
dtype: string
- name: en
dtype: string
- name: th
dtype: string
splits:
- name: train
num_bytes: 168583
num_examples: 584
- name: validation
num_bytes: 75552
num_examples: 251
download_size: 106639
dataset_size: 244135
---
# Dataset Card for "generated_ar_en_th_datasets"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Anand8078/esg_collection_4 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 10327
num_examples: 116
download_size: 6100
dataset_size: 10327
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sriramahesh2000/law-summary | ---
license: apache-2.0
---
|
alisson40889/louca | ---
license: openrail
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_75 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1235283556.0
num_examples: 242593
download_size: 1260641819
dataset_size: 1235283556.0
---
# Dataset Card for "chunk_75"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thiefcat/dataset_repository_name | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
cjvt/ssj500k | ---
annotations_creators:
- expert-generated
language_creators:
- found
- expert-generated
language:
- sl
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
- 10K<n<100K
source_datasets: []
task_categories:
- token-classification
task_ids:
- named-entity-recognition
- part-of-speech
- lemmatization
- parsing
pretty_name: ssj500k
tags:
- semantic-role-labeling
- multiword-expression-detection
---
# Dataset Card for ssj500k
**Important**: there exists another HF implementation of the dataset ([classla/ssj500k](https://huggingface.co/datasets/classla/ssj500k)), but it seems to be more narrowly focused. **This implementation is designed for more general use** - the CLASSLA version seems to expose only the specific training/validation/test annotations used in the CLASSLA library, for only a subset of the data.
### Dataset Summary
The ssj500k training corpus contains about 500 000 tokens manually annotated on the levels of tokenization, sentence segmentation, morphosyntactic tagging, and lemmatization. It is also partially annotated for the following tasks:
- named entity recognition (config `named_entity_recognition`)
- dependency parsing (*), Universal Dependencies style (config `dependency_parsing_ud`)
- dependency parsing, JOS/MULTEXT-East style (config `dependency_parsing_jos`)
- semantic role labeling (config `semantic_role_labeling`)
- multi-word expressions (config `multiword_expressions`)
If you want to load all the data along with their partial annotations, please use the config `all_data`.
\* _The UD dependency parsing labels are included here for completeness, but using the dataset [universal_dependencies](https://huggingface.co/datasets/universal_dependencies) should be preferred for dependency parsing applications to ensure you are using the most up-to-date data._
### Supported Tasks and Leaderboards
Tokenization, sentence segmentation, morphosyntactic tagging, lemmatization, named entity recognition, dependency parsing, semantic role labeling, multi-word expression detection.
### Languages
Slovenian.
## Dataset Structure
### Data Instances
A sample instance from the dataset (using the config `all_data`):
```
{
'id_doc': 'ssj1',
'idx_par': 0,
'idx_sent': 0,
'id_words': ['ssj1.1.1.t1', 'ssj1.1.1.t2', 'ssj1.1.1.t3', 'ssj1.1.1.t4', 'ssj1.1.1.t5', 'ssj1.1.1.t6', 'ssj1.1.1.t7', 'ssj1.1.1.t8', 'ssj1.1.1.t9', 'ssj1.1.1.t10', 'ssj1.1.1.t11', 'ssj1.1.1.t12', 'ssj1.1.1.t13', 'ssj1.1.1.t14', 'ssj1.1.1.t15', 'ssj1.1.1.t16', 'ssj1.1.1.t17', 'ssj1.1.1.t18', 'ssj1.1.1.t19', 'ssj1.1.1.t20', 'ssj1.1.1.t21', 'ssj1.1.1.t22', 'ssj1.1.1.t23', 'ssj1.1.1.t24'],
'words': ['"', 'Tistega', 'večera', 'sem', 'preveč', 'popil', ',', 'zgodilo', 'se', 'je', 'mesec', 'dni', 'po', 'tem', ',', 'ko', 'sem', 'izvedel', ',', 'da', 'me', 'žena', 'vara', '.'],
'lemmas': ['"', 'tisti', 'večer', 'biti', 'preveč', 'popiti', ',', 'zgoditi', 'se', 'biti', 'mesec', 'dan', 'po', 'ta', ',', 'ko', 'biti', 'izvedeti', ',', 'da', 'jaz', 'žena', 'varati', '.'],
'msds': ['UPosTag=PUNCT', 'UPosTag=DET|Case=Gen|Gender=Masc|Number=Sing|PronType=Dem', 'UPosTag=NOUN|Case=Gen|Gender=Masc|Number=Sing', 'UPosTag=AUX|Mood=Ind|Number=Sing|Person=1|Polarity=Pos|Tense=Pres|VerbForm=Fin', 'UPosTag=DET|PronType=Ind', 'UPosTag=VERB|Aspect=Perf|Gender=Masc|Number=Sing|VerbForm=Part', 'UPosTag=PUNCT', 'UPosTag=VERB|Aspect=Perf|Gender=Neut|Number=Sing|VerbForm=Part', 'UPosTag=PRON|PronType=Prs|Reflex=Yes|Variant=Short', 'UPosTag=AUX|Mood=Ind|Number=Sing|Person=3|Polarity=Pos|Tense=Pres|VerbForm=Fin', 'UPosTag=NOUN|Animacy=Inan|Case=Acc|Gender=Masc|Number=Sing', 'UPosTag=NOUN|Case=Gen|Gender=Masc|Number=Plur', 'UPosTag=ADP|Case=Loc', 'UPosTag=DET|Case=Loc|Gender=Neut|Number=Sing|PronType=Dem', 'UPosTag=PUNCT', 'UPosTag=SCONJ', 'UPosTag=AUX|Mood=Ind|Number=Sing|Person=1|Polarity=Pos|Tense=Pres|VerbForm=Fin', 'UPosTag=VERB|Aspect=Perf|Gender=Masc|Number=Sing|VerbForm=Part', 'UPosTag=PUNCT', 'UPosTag=SCONJ', 'UPosTag=PRON|Case=Acc|Number=Sing|Person=1|PronType=Prs|Variant=Short', 'UPosTag=NOUN|Case=Nom|Gender=Fem|Number=Sing', 'UPosTag=VERB|Aspect=Imp|Mood=Ind|Number=Sing|Person=3|Tense=Pres|VerbForm=Fin', 'UPosTag=PUNCT'],
'has_ne_ann': True,
'has_ud_dep_ann': True,
'has_jos_dep_ann': True,
'has_srl_ann': True,
'has_mwe_ann': True,
'ne_tags': ['O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O'],
'ud_dep_head': [5, 2, 5, 5, 5, -1, 7, 5, 7, 7, 7, 10, 13, 10, 17, 17, 17, 13, 22, 22, 22, 22, 17, 5],
'ud_dep_rel': ['punct', 'det', 'obl', 'aux', 'advmod', 'root', 'punct', 'parataxis', 'expl', 'aux', 'obl', 'nmod', 'case', 'nmod', 'punct', 'mark', 'aux', 'acl', 'punct', 'mark', 'obj', 'nsubj', 'ccomp', 'punct'],
'jos_dep_head': [-1, 2, 5, 5, 5, -1, -1, -1, 7, 7, 7, 10, 13, 10, -1, 17, 17, 13, -1, 22, 22, 22, 17, -1],
'jos_dep_rel': ['Root', 'Atr', 'AdvO', 'PPart', 'AdvM', 'Root', 'Root', 'Root', 'PPart', 'PPart', 'AdvO', 'Atr', 'Atr', 'Atr', 'Root', 'Conj', 'PPart', 'Atr', 'Root', 'Conj', 'Obj', 'Sb', 'Obj', 'Root'],
'srl_info': [
{'idx_arg': 2, 'idx_head': 5, 'role': 'TIME'},
{'idx_arg': 4, 'idx_head': 5, 'role': 'QUANT'},
{'idx_arg': 10, 'idx_head': 7, 'role': 'TIME'},
{'idx_arg': 20, 'idx_head': 22, 'role': 'PAT'},
{'idx_arg': 21, 'idx_head': 22, 'role': 'ACT'},
{'idx_arg': 22, 'idx_head': 17, 'role': 'RESLT'}
],
'mwe_info': [
{'type': 'IRV', 'word_indices': [7, 8]}
]
}
```
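The `mwe_info` entry above marks an inherently reflexive verb (IRV) at word indices 7 and 8; its surface form can be recovered from `words`. A minimal sketch using just this sample instance:

```python
# Recover the surface tokens of each multi-word expression in one example.
# `example` is a trimmed copy of the sample instance above.
example = {
    "words": ['"', "Tistega", "večera", "sem", "preveč", "popil", ",",
              "zgodilo", "se", "je", "mesec", "dni", "po", "tem", ",",
              "ko", "sem", "izvedel", ",", "da", "me", "žena", "vara", "."],
    "mwe_info": [{"type": "IRV", "word_indices": [7, 8]}],
}

for mwe in example["mwe_info"]:
    tokens = [example["words"][i] for i in mwe["word_indices"]]
    print(mwe["type"], " ".join(tokens))  # IRV zgodilo se
```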
### Data Fields
The following attributes are present in the most general config (`all_data`). Please see below for attributes present in the specific configs.
- `id_doc`: a string containing the identifier of the document;
- `idx_par`: an int32 containing the consecutive number of the paragraph, which the current sentence is a part of;
- `idx_sent`: an int32 containing the consecutive number of the current sentence inside the current paragraph;
- `id_words`: a list of strings containing the identifiers of words - potentially redundant, helpful for connecting the dataset with external datasets like coref149;
- `words`: a list of strings containing the words in the current sentence;
- `lemmas`: a list of strings containing the lemmas in the current sentence;
- `msds`: a list of strings containing the morphosyntactic description of words in the current sentence;
- `has_ne_ann`: a bool indicating whether the current example has named entities annotated;
- `has_ud_dep_ann`: a bool indicating whether the current example has dependencies (in UD style) annotated;
- `has_jos_dep_ann`: a bool indicating whether the current example has dependencies (in JOS style) annotated;
- `has_srl_ann`: a bool indicating whether the current example has semantic roles annotated;
- `has_mwe_ann`: a bool indicating whether the current example has multi-word expressions annotated;
- `ne_tags`: a list of strings containing the named entity tags encoded using IOB2 - if `has_ne_ann=False` all tokens are annotated with `"N/A"`;
- `ud_dep_head`: a list of int32 containing the head index for each word (using UD guidelines) - the head index of the root word is `-1`; if `has_ud_dep_ann=False` all tokens are annotated with `-2`;
- `ud_dep_rel`: a list of strings containing the relation with the head for each word (using UD guidelines) - if `has_ud_dep_ann=False` all tokens are annotated with `"N/A"`;
- `jos_dep_head`: a list of int32 containing the head index for each word (using JOS guidelines) - the head index of the root word is `-1`; if `has_jos_dep_ann=False` all tokens are annotated with `-2`;
- `jos_dep_rel`: a list of strings containing the relation with the head for each word (using JOS guidelines) - if `has_jos_dep_ann=False` all tokens are annotated with `"N/A"`;
- `srl_info`: a list of dicts, each containing index of the argument word, the head (verb) word, and the semantic role - if `has_srl_ann=False` this list is empty;
- `mwe_info`: a list of dicts, each containing word indices and the type of a multi-word expression.
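As a hedged illustration of how the `ne_tags` field can be consumed, the sketch below decodes an IOB2 tag sequence into entity spans; the example sentence and tag labels are hypothetical and not taken from the corpus:

```python
def iob2_to_spans(words, ne_tags):
    """Decode an IOB2-encoded tag sequence into (entity_type, start, end) spans.

    `end` is exclusive. Tokens tagged "O" or "N/A" (the filler used when
    has_ne_ann=False) are ignored; an I- tag whose type does not match the
    open entity is treated as closing the span.
    """
    spans = []
    start, ent_type = None, None
    for i, tag in enumerate(ne_tags):
        if tag.startswith("B-"):
            if start is not None:
                spans.append((ent_type, start, i))
            start, ent_type = i, tag[2:]
        elif tag.startswith("I-") and ent_type == tag[2:]:
            continue  # still inside the current entity
        else:  # "O", "N/A", or an inconsistent I- tag
            if start is not None:
                spans.append((ent_type, start, i))
            start, ent_type = None, None
    if start is not None:
        spans.append((ent_type, start, len(ne_tags)))
    return spans

# Hypothetical sentence, not from the corpus:
words = ["Janez", "Novak", "lives", "in", "Ljubljana", "."]
tags = ["B-PER", "I-PER", "O", "O", "B-LOC", "O"]
print(iob2_to_spans(words, tags))  # [('PER', 0, 2), ('LOC', 4, 5)]
```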
#### Data fields in 'named_entity_recognition'
```
['id_doc', 'idx_par', 'idx_sent', 'id_words', 'words', 'lemmas', 'msds', 'ne_tags']
```
#### Data fields in 'dependency_parsing_ud'
```
['id_doc', 'idx_par', 'idx_sent', 'id_words', 'words', 'lemmas', 'msds', 'ud_dep_head', 'ud_dep_rel']
```
#### Data fields in 'dependency_parsing_jos'
```
['id_doc', 'idx_par', 'idx_sent', 'id_words', 'words', 'lemmas', 'msds', 'jos_dep_head', 'jos_dep_rel']
```
#### Data fields in 'semantic_role_labeling'
```
['id_doc', 'idx_par', 'idx_sent', 'id_words', 'words', 'lemmas', 'msds', 'srl_info']
```
#### Data fields in 'multiword_expressions'
```
['id_doc', 'idx_par', 'idx_sent', 'id_words', 'words', 'lemmas', 'msds', 'mwe_info']
```
## Additional Information
### Dataset Curators
Simon Krek et al. (please see http://hdl.handle.net/11356/1434 for the full list)
### Licensing Information
CC BY-NC-SA 4.0.
### Citation Information
The paper describing the dataset:
```
@InProceedings{krek2020ssj500k,
title = {The ssj500k Training Corpus for Slovene Language Processing},
author={Krek, Simon and Erjavec, Tomaž and Dobrovoljc, Kaja and Gantar, Polona and Arhar Holdt, Spela and Čibej, Jaka and Brank, Janez},
booktitle={Proceedings of the Conference on Language Technologies and Digital Humanities},
year={2020},
pages={24-33}
}
```
The resource itself:
```
@misc{krek2021clarinssj500k,
title = {Training corpus ssj500k 2.3},
author = {Krek, Simon and Dobrovoljc, Kaja and Erjavec, Toma{\v z} and Mo{\v z}e, Sara and Ledinek, Nina and Holz, Nanika and Zupan, Katja and Gantar, Polona and Kuzman, Taja and {\v C}ibej, Jaka and Arhar Holdt, {\v S}pela and Kav{\v c}i{\v c}, Teja and {\v S}krjanec, Iza and Marko, Dafne and Jezer{\v s}ek, Lucija and Zajc, Anja},
url = {http://hdl.handle.net/11356/1434},
 year = {2021}
}
```
### Contributions
Thanks to [@matejklemen](https://github.com/matejklemen) for adding this dataset. |
pvrancx/tyk2_fep | ---
license: apache-2.0
dataset_info:
features:
- name: Smiles
dtype: string
- name: DockingScore
dtype: float64
- name: dG
dtype: float64
- name: dGError
dtype: float64
splits:
- name: train
num_bytes: 641714
num_examples: 8997
- name: test
num_bytes: 71163
num_examples: 1000
download_size: 315048
dataset_size: 712877
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
tags:
- molecule
- chemistry
- smiles
- free_energy
size_categories:
- 1K<n<10K
---
Molecular dataset: ~10,000 TYK2 inhibitors (SMILES strings) with docking scores and relative binding free energies (dG).

Dataset from the paper:

James Thompson, W Patrick Walters, Jianwen A Feng, Nicolas A Pabon, Hongcheng Xu, Michael Maser, Brian B Goldman, Demetri Moustakas, Molly Schmidt, Forrest York,
Optimizing active learning for free energy calculations, Artificial Intelligence in the Life Sciences, Volume 2, 2022, 100050, ISSN 2667-3185,
https://doi.org/10.1016/j.ailsci.2022.100050.
https://www.sciencedirect.com/science/article/pii/S2667318522000204

Original source: https://github.com/google-research/google-research/tree/master/al_for_fep
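The `dG` column is a relative binding free energy. Assuming it is reported in kcal/mol (the convention for FEP results; not stated explicitly in this card), a difference in dG maps to a binding-affinity ratio via ΔΔG = -RT ln(K_rel). A minimal sketch:

```python
import math

R = 1.98720425864083e-3  # gas constant in kcal/(mol*K)
T = 298.15               # temperature in K

def affinity_ratio(ddg_kcal_per_mol):
    """Relative binding-affinity ratio implied by a free-energy difference.

    ddG = -RT * ln(ratio)  =>  ratio = exp(-ddG / RT)
    A negative ddG (the stronger binder) gives a ratio > 1.
    Assumes ddG is in kcal/mol at 298.15 K.
    """
    return math.exp(-ddg_kcal_per_mol / (R * T))

# A ~1.36 kcal/mol improvement corresponds to roughly a 10x affinity gain:
print(affinity_ratio(-1.36))
```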
open-llm-leaderboard/details_neovalle__H4rmoniousAnthea | ---
pretty_name: Evaluation run of neovalle/H4rmoniousAnthea
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [neovalle/H4rmoniousAnthea](https://huggingface.co/neovalle/H4rmoniousAnthea)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_neovalle__H4rmoniousAnthea\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T05:56:18.084977](https://huggingface.co/datasets/open-llm-leaderboard/details_neovalle__H4rmoniousAnthea/blob/main/results_2024-01-25T05-56-18.084977.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.630582687926132,\n\
\ \"acc_stderr\": 0.03217475637386003,\n \"acc_norm\": 0.6405543770437054,\n\
\ \"acc_norm_stderr\": 0.03288367770300838,\n \"mc1\": 0.3708690330477356,\n\
\ \"mc1_stderr\": 0.016909693580248818,\n \"mc2\": 0.5507958189879629,\n\
\ \"mc2_stderr\": 0.015408052923903376\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.621160409556314,\n \"acc_stderr\": 0.014175915490000324,\n\
\ \"acc_norm\": 0.658703071672355,\n \"acc_norm_stderr\": 0.013855831287497731\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6528579964150567,\n\
\ \"acc_stderr\": 0.00475088440109516,\n \"acc_norm\": 0.8408683529177454,\n\
\ \"acc_norm_stderr\": 0.003650512158306266\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\
\ \"acc_stderr\": 0.02289168798455495,\n \"acc_norm\": 0.7967741935483871,\n\
\ \"acc_norm_stderr\": 0.02289168798455495\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6230769230769231,\n \"acc_stderr\": 0.024570975364225995,\n\
\ \"acc_norm\": 0.6230769230769231,\n \"acc_norm_stderr\": 0.024570975364225995\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881564,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881564\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.01612927102509986,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.01612927102509986\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474086,\n\
\ \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474086\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3106145251396648,\n\
\ \"acc_stderr\": 0.015476515438005566,\n \"acc_norm\": 0.3106145251396648,\n\
\ \"acc_norm_stderr\": 0.015476515438005566\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n\
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4621903520208605,\n \"acc_stderr\": 0.012733671880342506,\n\
\ \"acc_norm\": 0.4621903520208605,\n \"acc_norm_stderr\": 0.012733671880342506\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462937,\n \"\
acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462937\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \
\ \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.02768691358801301,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.02768691358801301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3708690330477356,\n\
\ \"mc1_stderr\": 0.016909693580248818,\n \"mc2\": 0.5507958189879629,\n\
\ \"mc2_stderr\": 0.015408052923903376\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.011850040124850508\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12964366944655042,\n \
\ \"acc_stderr\": 0.009252657757825552\n }\n}\n```"
repo_url: https://huggingface.co/neovalle/H4rmoniousAnthea
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|arc:challenge|25_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|gsm8k|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hellaswag|10_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T05-56-18.084977.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T05-56-18.084977.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- '**/details_harness|winogrande|5_2024-01-25T05-56-18.084977.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T05-56-18.084977.parquet'
- config_name: results
data_files:
- split: 2024_01_25T05_56_18.084977
path:
- results_2024-01-25T05-56-18.084977.parquet
- split: latest
path:
- results_2024-01-25T05-56-18.084977.parquet
---
# Dataset Card for Evaluation run of neovalle/H4rmoniousAnthea
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [neovalle/H4rmoniousAnthea](https://huggingface.co/neovalle/H4rmoniousAnthea) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_neovalle__H4rmoniousAnthea",
"harness_winogrande_5",
split="train")
```
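Each configuration name listed above encodes the harness task and the number of few-shot examples used (e.g. `harness_hendrycksTest_anatomy_5` is the anatomy MMLU task evaluated 5-shot), so the full list can be navigated programmatically. A minimal sketch of parsing that convention (the helper below is illustrative, not part of the `datasets` API):

```python
def parse_config_name(name: str) -> tuple[str, int]:
    """Split a leaderboard config name like 'harness_hendrycksTest_anatomy_5'
    into its task identifier and few-shot count."""
    task, _, shots = name.rpartition("_")
    return task, int(shots)

task, n_shot = parse_config_name("harness_hendrycksTest_anatomy_5")
# task == "harness_hendrycksTest_anatomy", n_shot == 5
```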
## Latest results
These are the [latest results from run 2024-01-25T05:56:18.084977](https://huggingface.co/datasets/open-llm-leaderboard/details_neovalle__H4rmoniousAnthea/blob/main/results_2024-01-25T05-56-18.084977.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.630582687926132,
"acc_stderr": 0.03217475637386003,
"acc_norm": 0.6405543770437054,
"acc_norm_stderr": 0.03288367770300838,
"mc1": 0.3708690330477356,
"mc1_stderr": 0.016909693580248818,
"mc2": 0.5507958189879629,
"mc2_stderr": 0.015408052923903376
},
"harness|arc:challenge|25": {
"acc": 0.621160409556314,
"acc_stderr": 0.014175915490000324,
"acc_norm": 0.658703071672355,
"acc_norm_stderr": 0.013855831287497731
},
"harness|hellaswag|10": {
"acc": 0.6528579964150567,
"acc_stderr": 0.00475088440109516,
"acc_norm": 0.8408683529177454,
"acc_norm_stderr": 0.003650512158306266
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455495,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455495
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6230769230769231,
"acc_stderr": 0.024570975364225995,
"acc_norm": 0.6230769230769231,
"acc_norm_stderr": 0.024570975364225995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881564,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881564
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.01612927102509986,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.01612927102509986
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474086,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474086
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3106145251396648,
"acc_stderr": 0.015476515438005566,
"acc_norm": 0.3106145251396648,
"acc_norm_stderr": 0.015476515438005566
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4621903520208605,
"acc_stderr": 0.012733671880342506,
"acc_norm": 0.4621903520208605,
"acc_norm_stderr": 0.012733671880342506
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462937,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462937
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801301,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3708690330477356,
"mc1_stderr": 0.016909693580248818,
"mc2": 0.5507958189879629,
"mc2_stderr": 0.015408052923903376
},
"harness|winogrande|5": {
"acc": 0.7687450670876085,
"acc_stderr": 0.011850040124850508
},
"harness|gsm8k|5": {
"acc": 0.12964366944655042,
"acc_stderr": 0.009252657757825552
}
}
```
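The per-task entries above share the same key layout, so aggregated metrics such as the MMLU macro-average can be recomputed directly from the dictionary. A minimal sketch over a small excerpt of the results (values copied from above; the excerpt is illustrative, not the full result set):

```python
# A small excerpt of the per-task results dictionary shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5851851851851851},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6776315789473685},
}

# Macro-average accuracy over the MMLU (hendrycksTest) tasks present.
mmlu_accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
```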
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]