| datasetId | card |
|---|---|
chohi/private_processed_demo | ---
dataset_info:
features:
- name: input
dtype: int64
- name: output
dtype: string
- name: instruction
dtype: string
- name: data_source
dtype: string
splits:
- name: train
num_bytes: 2558
num_examples: 3
download_size: 6384
dataset_size: 2558
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Vageesh1/Smart_Contract_HF | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: address
dtype: string
- name: source_code
dtype: string
- name: bytecode
dtype: string
- name: slither
dtype: string
- name: success
dtype: bool
- name: error
dtype: float64
- name: results
dtype: string
splits:
- name: train
num_bytes: 2542103760
num_examples: 60000
download_size: 814198965
dataset_size: 2542103760
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Smart_Contract_HF"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hac541309/multilingual_tokenizers | ---
license: other
---
A collection of tokenizers from various sources.
My own are Apache 2.0, but the others are not;
each is accompanied by its license.
|
zdy023/WikiHow-taskset | ---
license: apache-2.0
---
(Works with [Mobile-Env v3.x](https://github.com/X-LANCE/Mobile-Env/tree/v3.0).)
# WikiHow Task Set
WikiHow task set is an InfoUI interaction task set based on
[Mobile-Env](https://github.com/X-LANCE/Mobile-Env) proposed in [*Mobile-Env:
An Evaluation Platform and Benchmark for Interactive Agents in LLM
Era*](https://arxiv.org/abs/2305.08144).
[WikiHow](https://www.wikihow.com/Main-Page) is a collaborative wiki site
offering real-life tips in more than 340,000 online articles. To construct the
task set, 107,448 pages were crawled, and the dumped website data occupy about
88 GiB in total.
Several task definition templates are designed according to the functions of
the WikiHow app, and task definitions are instantiated through the template
toolkit in Mobile-Env. 577 tasks are sampled from the extended set and form the
*canonical set* (`wikihow-canonical.tar.xz`). Owing to budget limits, only 150
tasks are tested with the proposed LLM-based agent. These 150 tasks are given
in `wikihow-microcanon.tar.xz`; we call them the *canonical subset* or the
*micro canonical set*.
### Website Data Replay
The replay script for [mitmproxy](https://mitmproxy.org/) is given as
`replay_url.py`. To use this replay script, the information retrieval tool
[Pyserini](https://github.com/castorini/pyserini/) is required. Four parameters
are expected to be assigned in the script:
+ The crawled data from WikiHow website (`dumps` in `wikihow.data.tar.xz`)
+ The HTML templates used to mock the search result page (`templates` in
`wikihow.data.tar.xz`)
+ The indices for the search engine based on Pyserini (`indices-t/indices` in
`wikihow.data.tar.xz`)
+ The metadata of the crawled articles (`indices-t/docs/doc_meta.csv` in
`wikihow.data.tar.xz`)
All the required data are offered in `wikihow.data.tar.xz`. (The archive is
about 78 GiB; the decompressed data are about 88 GiB.) The archive is split
into two pieces (`wikihow.data.tar.xz.00` and `wikihow.data.tar.xz.01`), which
you can concatenate with `cat`:
```sh
cat wikihow.data.tar.xz.00 wikihow.data.tar.xz.01 >wikihow.data.tar.xz
```
The SHA256 checksums are provided in `wikihow.data.tar.xz.sha256` to check the
integrity.
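The concatenate-and-verify workflow can be exercised end-to-end on small dummy files (a sketch; the real pieces are `wikihow.data.tar.xz.00`/`.01` and the provided `.sha256` file, which is assumed to follow the standard `sha256sum` format):

```sh
# Demonstrate the split-concatenate-verify workflow with dummy files.
mkdir -p demo
printf 'part one ' > demo/archive.xz.00
printf 'part two'  > demo/archive.xz.01
# Reassemble the pieces, exactly as with the real archive
cat demo/archive.xz.00 demo/archive.xz.01 > demo/archive.xz
# Record and then verify the SHA256 checksum
sha256sum demo/archive.xz > demo/archive.xz.sha256
sha256sum -c demo/archive.xz.sha256
```

On success, `sha256sum -c` reports the file as `OK`; any corruption of the reassembled archive makes the check fail.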
To run the script:
```sh
mitmproxy --showhost -s replay_url.py
```
### Certificate Unpinning Plan
The `syscert` plan proposed by Mobile-Env works for the WikiHow app. You can
complete the config according to the [guideline of
Mobile-Env](https://github.com/X-LANCE/Mobile-Env/blob/master/docs/dynamic-app-en.md).
The APK package available from [APKCombo](https://apkcombo.com/) is provided.
Note that an AVD image of Android 11.0 (API Level 30, Google APIs) should be
used to obtain the best compatibility and a root-enabled ADBD.
### Human-Rewritten Instructions
Human-rewritten instructions for the *canonical set* are released under
`instruction_rewriting/`. An AndroidEnv wrapper `InstructionRewritingWrapper`
is provided to load the rewritten instructions (`merged_doccano.json`) and
public patterns (`pattern-*.txt`). The annotations are collected via
[doccano](https://doccano.github.io/doccano/). The patterns are parsed by
[`sentence_pattern.py`](instruction_rewriting/sentence_pattern.py).
### Details of Sub-Tasks
WikiHow tasks are crafted from 16 types of sub-tasks:
* `home2search`, instructing to search for an article from the home page.
* `search2article`, `author2article`, & `category2article`, instructing to
access an article from search result page, author information page, and
category content page, respectively.
* `article2about`, instructing to access the about page from article page.
* `article2author`, instructing to access author information page from article
page.
* `article2category`, instructing to access category content page from article
page.
* `article2reference`, instructing to check reference list on article page.
* `article2rate_no`, instructing to rate "no" for an article.
* `article2rate_yes`, instructing to rate "yes" for an article.
* `article2share`, instructing to share an article.
* `article2bookmark`, instructing to bookmark an article and then check the
bookmarks.
* `article2steps`, crafted from `stepped_summary` questions in
[wikihow-lists](https://huggingface.co/datasets/b-mc2/wikihow_lists).
* `article2ingredientes`, crafted from `ingredients` questions in
[wikihow-lists](https://huggingface.co/datasets/b-mc2/wikihow_lists).
* `article2needed_items`, crafted from `needed_items` questions in
[wikihow-lists](https://huggingface.co/datasets/b-mc2/wikihow_lists).
* `article2summary`, crafted from
[WikiHowNFQA](https://huggingface.co/datasets/Lurunchik/WikiHowNFQA) tasks.
A template is composed for each sub-task, containing a group of slots to be
filled with keywords such as the article title, author name, question, and
ground-truth answer. These keywords are sampled from the crawled app data or
from the two QA datasets to instantiate the templates. Subsequently, the
instantiated templates are concatenated into multi-stage task definitions under
the constraint that the target page/element/answer (the part after `2`, *e.g.*,
`share` in `article2share`) is directly on, or referenced by, the current page
(the part before `2`, *e.g.*, `article` in `article2share`). Finally, we obtain
a task set of 150 multi-stage tasks containing an average of 2.68 single-stage
sub-tasks each.
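The instantiate-and-chain procedure can be sketched in Python (a minimal illustration with hypothetical templates, not the actual Mobile-Env template toolkit):

```python
# Hypothetical sub-task templates with keyword slots (the real toolkit
# uses many more templates, with richer slots such as author and answer).
TEMPLATES = {
    "home2search": 'From the home page, search for "{title}".',
    "search2article": 'Open the article "{title}" from the search results.',
    "article2share": "Share the current article.",
}

def instantiate(sub_task, **keywords):
    """Fill a template's slots with sampled keywords."""
    return TEMPLATES[sub_task].format(**keywords)

def chain(stages, **keywords):
    """Concatenate sub-tasks into one multi-stage task definition.

    A stage's target (the part after "2") must be the next stage's
    source (the part before "2"), mirroring the reachability constraint.
    """
    for prev, nxt in zip(stages, stages[1:]):
        assert prev.split("2", 1)[1] == nxt.split("2", 1)[0], (prev, nxt)
    return " Then, ".join(instantiate(s, **keywords) for s in stages)

print(chain(["home2search", "search2article", "article2share"],
            title="Make a Paper Plane"))
```

Chaining `home2search` into `article2share` directly would fail the assertion, because an article page is not reachable from the search stage's target.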
### About
This task set is developed and maintained by [SJTU
X-Lance](https://x-lance.sjtu.edu.cn/en). The corresponding paper is available
at <https://arxiv.org/abs/2305.08144>.
If you find the WikiHow task set useful in your research, please cite the
project with the following BibTeX:
```bibtex
@article{DanyangZhang2023_MobileEnv_WikiHow,
title = {{Mobile-Env}: An Evaluation Platform and Benchmark for LLM-GUI Interaction},
author = {Danyang Zhang and
Lu Chen and
Zihan Zhao and
Ruisheng Cao and
Kai Yu},
journal = {CoRR},
volume = {abs/2305.08144},
year = {2023},
url = {https://arxiv.org/abs/2305.08144},
eprinttype = {arXiv},
eprint = {2305.08144},
}
```
|
PJMixers/grimulkan_bluemoon_Karen_cleaned-carded-failures | ---
language:
- en
source_datasets: grimulkan/bluemoon_Karen_cleaned
tags:
- not-for-all-audiences
---
These are all the samples for which a correctly formatted card could not be generated within 5 attempts. The included card is the last attempt.
Card generation fails if any of these conditions is not met:
1. The card must contain the correct tags at the start of the correct lines.
2. The card must not contain any tag more than once.
3. The card must not contain any spaces before newlines.
4. The card must not contain any unwanted tags (`'<|system|>', '<|user|>', '<|model|>', '<|FIRST_CHARACTER_MESSAGE|>', '<|SECOND_CHARACTER_MESSAGE|>'`).
5. Generation must end with a stop token.
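A validator for these conditions might look like the following (a sketch; the required tag names are hypothetical, since the exact card format is not specified here):

```python
UNWANTED = ["<|system|>", "<|user|>", "<|model|>",
            "<|FIRST_CHARACTER_MESSAGE|>", "<|SECOND_CHARACTER_MESSAGE|>"]

def card_is_valid(card, required_tags=("Name:", "Description:"),
                  ended_with_stop=True):
    # required_tags are hypothetical placeholders for the real card fields.
    lines = card.split("\n")
    for tag in required_tags:
        # Conditions 1 & 2: each tag starts exactly one line
        if sum(line.startswith(tag) for line in lines) != 1:
            return False
    # Condition 3: no trailing spaces before newlines
    if any(line.endswith(" ") for line in lines):
        return False
    # Condition 4: no unwanted chat-format tags anywhere
    if any(t in card for t in UNWANTED):
        return False
    # Condition 5: generation must have terminated with a stop token
    return ended_with_stop
```

Any single failing condition rejects the card, matching the "all cases must be met" rule above.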
*Note: samples that would end up longer than 8192 tokens, or that have fewer than 4 turns, were ignored entirely and are not saved here.* |
EgilKarlsen/BGL_BERT_Finetuned | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115582709.0625
num_examples: 37500
- name: test
num_bytes: 38527570.0
num_examples: 12500
download_size: 211883038
dataset_size: 154110279.0625
---
# Dataset Card for "BGL_BERT_Finetuned"
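Each row of this card's schema stores a 768-dimensional embedding as string-named columns `'0'`…`'767'` plus a string `label`; collapsing a row into one feature vector might be sketched like this (over a plain dict, not tied to any loading library; the example values are dummies):

```python
def row_to_vector(row, dim=768):
    """Collect the float columns '0'..'767', in order, plus the label."""
    return [row[str(i)] for i in range(dim)], row["label"]

# Tiny illustration with a 4-dimensional dummy row
row = {"0": 0.1, "1": 0.2, "2": 0.3, "3": 0.4, "label": "dummy"}
vector, label = row_to_vector(row, dim=4)
```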
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
persian_ner | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- fa
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: Persian NER
dataset_info:
- config_name: fold1
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': I-event
'2': I-fac
'3': I-loc
'4': I-org
'5': I-pers
'6': I-pro
'7': B-event
'8': B-fac
'9': B-loc
'10': B-org
'11': B-pers
'12': B-pro
splits:
- name: train
num_bytes: 3362102
num_examples: 5121
- name: test
num_bytes: 1646481
num_examples: 2560
download_size: 1931170
dataset_size: 5008583
- config_name: fold2
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': I-event
'2': I-fac
'3': I-loc
'4': I-org
'5': I-pers
'6': I-pro
'7': B-event
'8': B-fac
'9': B-loc
'10': B-org
'11': B-pers
'12': B-pro
splits:
- name: train
num_bytes: 3344561
num_examples: 5120
- name: test
num_bytes: 1664022
num_examples: 2561
download_size: 1931170
dataset_size: 5008583
- config_name: fold3
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': I-event
'2': I-fac
'3': I-loc
'4': I-org
'5': I-pers
'6': I-pro
'7': B-event
'8': B-fac
'9': B-loc
'10': B-org
'11': B-pers
'12': B-pro
splits:
- name: train
num_bytes: 3310491
num_examples: 5121
- name: test
num_bytes: 1698092
num_examples: 2560
download_size: 1931170
dataset_size: 5008583
---
# Dataset Card for Persian NER
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://github.com/HaniehP/PersianNER)
- **Repository:** [Github](https://github.com/HaniehP/PersianNER)
- **Paper:** [Aclweb](https://www.aclweb.org/anthology/C16-1319)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The dataset includes 7,682 Persian sentences, split into 250,015 tokens and their NER labels. It is available in 3 folds to be used in turn as training and test sets. The NER tags are in IOB format.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
### Data Fields
- `id`: id of the sample
- `tokens`: the tokens of the example text
- `ner_tags`: the NER tags of each token
The NER tags correspond to this list:
```
"O", "I-event", "I-fac", "I-loc", "I-org", "I-pers", "I-pro", "B-event", "B-fac", "B-loc", "B-org", "B-pers", "B-pro"
```
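As a sketch of how the integer class indices in `ner_tags` map back to these strings (assuming the label order of the `class_label` feature above), decoding a tagged sequence looks like:

```python
# Label names in the order used by the `class_label` feature above.
NER_TAGS = [
    "O", "I-event", "I-fac", "I-loc", "I-org", "I-pers", "I-pro",
    "B-event", "B-fac", "B-loc", "B-org", "B-pers", "B-pro",
]

def ids_to_tags(tag_ids):
    """Map integer class indices to their IOB tag strings."""
    return [NER_TAGS[i] for i in tag_ids]

# Example: a person name spanning two tokens, followed by an O token.
print(ids_to_tags([11, 5, 0]))  # ['B-pers', 'I-pers', 'O']
```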
### Data Splits
Training and test splits
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
Hanieh Poostchi, Ehsan Zare Borzeshi, Mohammad Abdous, Massimo Piccardi
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
Hanieh Poostchi, Ehsan Zare Borzeshi, Mohammad Abdous, Massimo Piccardi
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
The dataset is published for academic use only.
### Dataset Curators
[More Information Needed]
### Licensing Information
Creative Commons Attribution 4.0 International License.
### Citation Information
```
@inproceedings{poostchi-etal-2016-personer,
    title = "{P}erso{NER}: {P}ersian Named-Entity Recognition",
    author = "Poostchi, Hanieh  and
      Zare Borzeshi, Ehsan  and
      Abdous, Mohammad  and
      Piccardi, Massimo",
    booktitle = "Proceedings of {COLING} 2016, the 26th International Conference on Computational Linguistics: Technical Papers",
    month = dec,
    year = "2016",
    address = "Osaka, Japan",
    publisher = "The COLING 2016 Organizing Committee",
    url = "https://www.aclweb.org/anthology/C16-1319",
    pages = "3381--3389",
    abstract = "Named-Entity Recognition (NER) is still a challenging task for languages with low digital resources. The main difficulties arise from the scarcity of annotated corpora and the consequent problematic training of an effective NER pipeline. To abridge this gap, in this paper we target the Persian language that is spoken by a population of over a hundred million people world-wide. We first present and provide ArmanPerosNERCorpus, the first manually-annotated Persian NER corpus. Then, we introduce PersoNER, an NER pipeline for Persian that leverages a word embedding and a sequential max-margin classifier. The experimental results show that the proposed approach is capable of achieving interesting MUC7 and CoNNL scores while outperforming two alternatives based on a CRF and a recurrent neural network.",
}
```
### Contributions
Thanks to [@KMFODA](https://github.com/KMFODA) for adding this dataset. |
ChristophSchuhmann/test | ---
license: apache-2.0
---
|
CyberHarem/naga_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of naga (Fire Emblem)
This is the dataset of naga (Fire Emblem), containing 30 images and their tags.
The core tags of this character are `long_hair, pointy_ears, green_hair, breasts, green_eyes, very_long_hair, large_breasts, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 30 | 34.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/naga_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 30 | 21.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/naga_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 58 | 35.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/naga_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 30 | 31.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/naga_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 58 | 48.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/naga_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/naga_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Results of tag clustering; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, solo, dress, navel, looking_at_viewer, bracelet, smile |
| 1 | 5 |  |  |  |  |  | 1boy, hetero, nude, tiara, nipples, penis, 1girl, blush, cum_on_breasts, ejaculation, facial, paizuri, uncensored, 3girls, abs, ass, group_sex, looking_at_viewer, parted_lips, ponytail, ribbon, smile, solo_focus |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | dress | navel | looking_at_viewer | bracelet | smile | 1boy | hetero | nude | tiara | nipples | penis | blush | cum_on_breasts | ejaculation | facial | paizuri | uncensored | 3girls | abs | ass | group_sex | parted_lips | ponytail | ribbon | solo_focus |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------|:--------------------|:-----------|:--------|:-------|:---------|:-------|:--------|:----------|:--------|:--------|:-----------------|:--------------|:---------|:----------|:-------------|:---------|:------|:------|:------------|:--------------|:-----------|:---------|:-------------|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
denis-berezutskiy-lad/ru_transcription_punctuation | ---
license: apache-2.0
---
# About
This is a dataset for training Russian punctuators/capitalizers via NeMo scripts (https://github.com/NVIDIA/NeMo)
A BERT model already fine-tuned on this dataset can be found here: https://huggingface.co/denis-berezutskiy-lad/lad_transcription_bert_ru_punctuator
Scripts for collecting/updating such a dataset, as well as training/using the model are located here: https://github.com/denis-berezutskiy-lad/transcription-bert-ru-punctuator-scripts/tree/main
The idea behind the project is to use large continuous professional transcriptions for training rather than relying on short low-quality samples consisting of 1-2 sentences (which is typical of the most popular datasets in Russian). Our experiments show significant improvements compared to BERTs trained on the standard Ru datasets (social comments, Omnia Russica, etc.). That's why we mainly use transcriptions published by Russian legislatures (Gosduma, Mosgorduma) with some addition of film subtitles from the OpenSubtitles project.
The dataset is in .csv format, but can be easily converted to the NeMo format (text.txt and labels.txt) - see the custom scripts above.
About 1.2 GB of data is from Gosduma, ~300 MB from Mosgorduma and ~300 MB from a snapshot of Russian OpenSubtitles, taken from the Taiga project (https://tatianashavrina.github.io/taiga_site/downloads.html).
The rows are ordered randomly by source document ("source_entity" column); however, within a document the order is preserved as in the original text. So if you need to regroup the texts (for example, to make longer or shorter sequences), you may concatenate rows grouped by the source entity and then split them as you want.
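A minimal stdlib sketch of that regrouping, assuming rows already parsed from the .csv (the `source_entity` column name is from the description above; the `text` column name is hypothetical). Note that `itertools.groupby` only requires rows with the same key to be contiguous, which holds here since a document's rows stay together:

```python
from itertools import groupby

# Toy rows standing in for the parsed .csv; within a document, order is preserved.
rows = [
    {"source_entity": "doc_a", "text": "Первое предложение."},
    {"source_entity": "doc_a", "text": "Второе предложение."},
    {"source_entity": "doc_b", "text": "Другой документ."},
]

# Concatenate rows per source document; the merged texts can then be
# re-split into sequences of whatever length the training setup needs.
merged = {
    key: " ".join(r["text"] for r in group)
    for key, group in groupby(rows, key=lambda r: r["source_entity"])
}
print(merged["doc_a"])  # Первое предложение. Второе предложение.
```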
# Supported labels
Please note that some new labels are not supported by NeMo scripts out of the box (-, —, T), so we need to add special handling for them. See the inference notebook for details.
## Punctuation
O,.?!:;…⁈-—
## Capitalization
OUT
(T means abbreviation ("total" uppercase)) |
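As an illustrative sketch of how these labels restore text (the per-token label format below — one punctuation character followed by one capitalization character, NeMo-style — is an assumption, as is the `restore` helper itself):

```python
def restore(tokens, labels):
    """Apply per-token labels: first char is the punctuation to append
    ('O' = none), second char is capitalization ('O' = none,
    'U' = uppercase first letter, 'T' = total uppercase)."""
    out = []
    for token, label in zip(tokens, labels):
        punct, cap = label[0], label[1]
        if cap == "U":
            token = token.capitalize()
        elif cap == "T":
            token = token.upper()
        if punct != "O":
            token += punct
        out.append(token)
    return " ".join(out)

print(restore(["привет", "мир"], ["OU", ".O"]))  # Привет мир.
```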
FINNUMBER/FINCH_TRAIN_SA_NEWFORMAT | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: 'null'
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 7624635
num_examples: 3343
download_size: 3661933
dataset_size: 7624635
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/kumuyu_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kumuyu (Granblue Fantasy)
This is the dataset of kumuyu (Granblue Fantasy), containing 16 images and their tags.
The core tags of this character are `horns, breasts, large_breasts, twintails, long_hair, brown_eyes, yellow_eyes, low_twintails, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 16 | 21.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kumuyu_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 16 | 12.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kumuyu_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 38 | 25.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kumuyu_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 16 | 19.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kumuyu_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 38 | 37.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kumuyu_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kumuyu_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Results of tag clustering; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, blush, draph, solo, looking_at_viewer, collarbone, hood, long_sleeves, open_mouth, sweat, wide_sleeves, breasts_out, inverted_nipples, navel, smile, tears |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | draph | solo | looking_at_viewer | collarbone | hood | long_sleeves | open_mouth | sweat | wide_sleeves | breasts_out | inverted_nipples | navel | smile | tears |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:-------|:--------------------|:-------------|:-------|:---------------|:-------------|:--------|:---------------|:--------------|:-------------------|:--------|:--------|:--------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/marc_female_fire_emblem_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of marc_female_fire_emblem (Fire Emblem)
This is the dataset of marc_female_fire_emblem (Fire Emblem), containing 23 images and their tags.
The core tags of this character are `black_hair, short_hair, purple_eyes, ahoge, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 24.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marc_female_fire_emblem_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 23 | 16.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marc_female_fire_emblem_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 47 | 28.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marc_female_fire_emblem_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 23 | 23.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marc_female_fire_emblem_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 47 | 36.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marc_female_fire_emblem_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/marc_female_fire_emblem_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Results of tag clustering; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | hood_down, long_sleeves, simple_background, solo, 1girl, holding_book, open_mouth, smile, full_body, looking_at_viewer, knee_boots, open_book, white_background, 1boy, bangs, black_gloves, brown_footwear, male_focus, thighhighs |
| 1 | 5 |  |  |  |  |  | long_sleeves, 1girl, hood_down, looking_at_viewer, open_mouth, 1boy, :d, black_gloves, solo, balloon, bangs, belt, blush, robe, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | hood_down | long_sleeves | simple_background | solo | 1girl | holding_book | open_mouth | smile | full_body | looking_at_viewer | knee_boots | open_book | white_background | 1boy | bangs | black_gloves | brown_footwear | male_focus | thighhighs | :d | balloon | belt | blush | robe | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------|:---------------|:--------------------|:-------|:--------|:---------------|:-------------|:--------|:------------|:--------------------|:-------------|:------------|:-------------------|:-------|:--------|:---------------|:-----------------|:-------------|:-------------|:-----|:----------|:-------|:--------|:-------|:-------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | | X | X | | X | | | X | | | | X | X | X | | | | X | X | X | X | X | X |
|
zolak/twitter_dataset_79_1713088440 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3451546
num_examples: 8264
download_size: 1708006
dataset_size: 3451546
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
presencesw/multinli | ---
dataset_info:
features:
- name: gold_label
dtype: string
- name: sentence1
dtype: string
- name: sentence2
dtype: string
splits:
- name: train
num_bytes: 75405059.0
num_examples: 392702
- name: dev_matched
num_bytes: 1855725.976
num_examples: 9815
- name: dev_mismatched
num_bytes: 1970757.5424
num_examples: 9832
download_size: 52413708
dataset_size: 79231542.5184
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev_matched
path: data/dev_matched-*
- split: dev_mismatched
path: data/dev_mismatched-*
---
|
roleplay4fun/pippa | ---
dataset_info:
features:
- name: bot_name
dtype: string
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
- name: memory
dtype: string
- name: examples
dtype: string
- name: user_name
dtype: string
- name: nsfw
dtype: bool
- name: metdata
struct:
- name: categories
sequence: string
splits:
- name: train
num_bytes: 231573863
num_examples: 16832
download_size: 114298464
dataset_size: 231573863
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Q-bert/test-dataset | ---
license: mit
---
|
CyberHarem/conte_di_cavour_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of conte_di_cavour (Kantai Collection)
This is the dataset of conte_di_cavour (Kantai Collection), containing 362 images and their tags.
The core tags of this character are `long_hair, breasts, large_breasts, grey_hair, two_side_up, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 362 | 413.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/conte_di_cavour_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 362 | 245.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/conte_di_cavour_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 867 | 534.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/conte_di_cavour_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 362 | 374.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/conte_di_cavour_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 867 | 749.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/conte_di_cavour_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/conte_di_cavour_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Results of tag clustering; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, blush, fish_print, yukata, wide_sleeves, long_sleeves, obi, official_alternate_costume, solo, collarbone, print_kimono, cleavage, white_kimono, sitting, white_background, white_hair, open_mouth |
| 1 | 20 |  |  |  |  |  | 1girl, long_sleeves, white_kimono, miko, red_hakama, wide_sleeves, solo, blush, hakama_skirt, open_mouth, simple_background, cleavage, collarbone, official_alternate_costume, white_hair, white_background, holding, smile, twitter_username |
| 2 | 10 |  |  |  |  |  | 1girl, cleavage, solo, black_bikini, blush, simple_background, white_background, cowboy_shot, collarbone, navel, closed_mouth, necklace, purple_eyes, smile, white_hair, looking_at_viewer, open_mouth |
| 3 | 7 |  |  |  |  |  | 1girl, dated, one-hour_drawing_challenge, solo, simple_background, twitter_username, white_background, blush, cowboy_shot, upper_body, bikini, blue_one-piece_swimsuit, cleavage, collarbone, competition_swimsuit, looking_at_viewer, navel, open_mouth |
| 4 | 5 |  |  |  |  |  | 1girl, cleavage_cutout, frilled_dress, layered_dress, simple_background, solo, white_background, white_dress, white_gloves, corset, looking_at_viewer, two-tone_dress, smile, upper_body, one-hour_drawing_challenge, sleeveless_dress |
| 5 | 5 |  |  |  |  |  | 1girl, cleavage_cutout, corset, frilled_dress, grey_dress, layered_dress, simple_background, solo, two-tone_dress, white_background, white_dress, white_gloves, purple_eyes, short_sleeves, upper_body |
| 6 | 7 |  |  |  |  |  | 1girl, blush, cleavage_cutout, frilled_dress, layered_dress, long_sleeves, solo, white_dress, grey_dress, simple_background, two-tone_dress, looking_at_viewer, white_background, closed_mouth |
| 7 | 5 |  |  |  |  |  | 1girl, cleavage_cutout, frilled_dress, grey_dress, layered_dress, long_sleeves, solo, two-tone_dress, white_dress, corset, open_mouth, smile, blush, armpit_cutout |
| 8 | 5 |  |  |  |  |  | 1girl, blush, cleavage_cutout, frilled_dress, layered_dress, two-tone_dress, white_dress, white_gloves, open_mouth, solo, grey_dress, looking_at_viewer, short_sleeves, twitter_username, upper_body |
| 9 | 11 |  |  |  |  |  | 1girl, cleavage, day, open_mouth, outdoors, smile, solo, blue_sky, cloud, navel, looking_at_viewer, side-tie_bikini_bottom, blush, collarbone, cowboy_shot, ocean, black_bikini, necklace, purple_eyes |
| 10 | 7 |  |  |  |  |  | neckerchief, sailor_dress, sleeveless_dress, white_dress, white_sailor_collar, 1girl, blush, cosplay, solo, striped, simple_background, cowboy_shot, open_mouth, sideboob, white_background |
| 11 | 6 |  |  |  |  |  | 1girl, completely_nude, navel, nipples, solo, blush, collarbone, purple_eyes |
| 12 | 8 |  |  |  |  |  | 1girl, solo, long_sleeves, white_shirt, collared_shirt, cowboy_shot, pleated_skirt, one-hour_drawing_challenge, school_uniform, simple_background, white_background, black_skirt, blush, jacket, official_alternate_costume, open_mouth, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | fish_print | yukata | wide_sleeves | long_sleeves | obi | official_alternate_costume | solo | collarbone | print_kimono | cleavage | white_kimono | sitting | white_background | white_hair | open_mouth | miko | red_hakama | hakama_skirt | simple_background | holding | smile | twitter_username | black_bikini | cowboy_shot | navel | closed_mouth | necklace | purple_eyes | looking_at_viewer | dated | one-hour_drawing_challenge | upper_body | bikini | blue_one-piece_swimsuit | competition_swimsuit | cleavage_cutout | frilled_dress | layered_dress | white_dress | white_gloves | corset | two-tone_dress | sleeveless_dress | grey_dress | short_sleeves | armpit_cutout | day | outdoors | blue_sky | cloud | side-tie_bikini_bottom | ocean | neckerchief | sailor_dress | white_sailor_collar | cosplay | striped | sideboob | completely_nude | nipples | white_shirt | collared_shirt | pleated_skirt | school_uniform | black_skirt | jacket |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:-------------|:---------|:---------------|:---------------|:------|:-----------------------------|:-------|:-------------|:---------------|:-----------|:---------------|:----------|:-------------------|:-------------|:-------------|:-------|:-------------|:---------------|:--------------------|:----------|:--------|:-------------------|:---------------|:--------------|:--------|:---------------|:-----------|:--------------|:--------------------|:--------|:-----------------------------|:-------------|:---------|:--------------------------|:-----------------------|:------------------|:----------------|:----------------|:--------------|:---------------|:---------|:-----------------|:-------------------|:-------------|:----------------|:----------------|:------|:-----------|:-----------|:--------|:-------------------------|:--------|:--------------|:---------------|:----------------------|:----------|:----------|:-----------|:------------------|:----------|:--------------|:-----------------|:----------------|:-----------------|:--------------|:---------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 20 |  |  |  |  |  | X | X | | | X | X | | X | X | X | | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | X | | | | | | | X | X | | X | | | X | X | X | | | | X | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | | | | | | | X | X | | X | | | X | | X | | | | X | | | X | | X | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | | | | | X | | | | | | X | | | | | | X | | X | | | | | | | | X | | X | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | | | | | X | | | | | | X | | | | | | X | | | | | | | | | X | | | | X | | | | X | X | X | X | X | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | X | | | | X | | | X | | | | | | X | | | | | | X | | | | | | | X | | | X | | | | | | | X | X | X | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | | | | X | | | X | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | | X | X | | X | | X | | | | | | | | | | | | | | | | | | | | |
| 8 | 5 |  |  |  |  |  | X | X | | | | | | | X | | | | | | | | X | | | | | | | X | | | | | | | X | | | X | | | | X | X | X | X | X | | X | | X | X | | | | | | | | | | | | | | | | | | | | | |
| 9 | 11 |  |  |  |  |  | X | X | | | | | | | X | X | | X | | | | | X | | | | | | X | | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 10 | 7 |  |  |  |  |  | X | X | | | | | | | X | | | | | | X | | X | | | | X | | | | | X | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | X | X | X | X | X | X | | | | | | | | |
| 11 | 6 |  |  |  |  |  | X | X | | | | | | | X | X | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | |
| 12 | 8 |  |  |  |  |  | X | X | | | | X | | X | X | | | | | | X | | X | | | | X | | X | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X |
|
TagsTest2024/tiny_llava_3 | ---
dataset_info:
features:
- name: URL
dtype: string
- name: TEXT
dtype: string
- name: tiny_llava
dtype: string
splits:
- name: anime_sfw_5000_test1
num_bytes: 7430026
num_examples: 5000
- name: ase6.5_5000_test1
num_bytes: 2649582
num_examples: 5000
- name: journeydb_5000_test1
num_bytes: 5954688
num_examples: 5000
download_size: 8386391
dataset_size: 16034296
configs:
- config_name: default
data_files:
- split: anime_sfw_5000_test1
path: data/anime_sfw_5000_test1-*
- split: ase6.5_5000_test1
path: data/ase6.5_5000_test1-*
- split: journeydb_5000_test1
path: data/journeydb_5000_test1-*
---
|
yuan-sf63/word_label_0.2_32_Nf | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
- name: '16'
dtype: int64
- name: '17'
dtype: int64
- name: '18'
dtype: int64
- name: '19'
dtype: int64
- name: '20'
dtype: int64
- name: '21'
dtype: int64
- name: '22'
dtype: int64
- name: '23'
dtype: int64
- name: '24'
dtype: int64
- name: '25'
dtype: int64
- name: '26'
dtype: int64
- name: '27'
dtype: int64
- name: '28'
dtype: int64
- name: '29'
dtype: int64
- name: '30'
dtype: int64
- name: '31'
dtype: int64
splits:
- name: train
num_bytes: 21579383.25483862
num_examples: 63520
- name: validation
num_bytes: 2397784.7451613816
num_examples: 7058
download_size: 5532335
dataset_size: 23977168.0
---
# Dataset Card for "word_label_0.2_32_Nf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
polinaeterna/test_push_two_confs | ---
dataset_info:
features:
- name: x
dtype: int64
- name: y
dtype: string
splits:
- name: train
num_bytes: 120
num_examples: 8
- name: test
num_bytes: 46
num_examples: 3
download_size: 1712
dataset_size: 166
---
# Dataset Card for "test_push_two_confs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HorcruxNo13/toolwear_complete_tool | ---
dataset_info:
features:
- name: name
dtype: string
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 464993039.0
num_examples: 43
download_size: 117710589
dataset_size: 464993039.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yan2069/ProfessorGarlick | ---
license: openrail
---
|
BangumiBase/rokudounoonnatachi | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Rokudou No Onna-tachi
This is the image base of the bangumi Rokudou no Onna-tachi. We detected 31 characters and 3153 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean and may contain noise.** If you intend to train models on this dataset manually, we recommend performing the necessary preprocessing on the downloaded data to eliminate potentially noisy samples (roughly 1% of images).
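As one possible preprocessing step (an illustrative sketch, not part of the official dataset tooling), exact-duplicate images in an extracted character folder can be dropped by content hash:

```python
import hashlib
from pathlib import Path

def dedup_images(folder):
    """Remove files whose byte content duplicates an earlier file (keeps the first copy)."""
    seen = set()
    removed = []
    for path in sorted(Path(folder).glob("*")):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if digest in seen:
            path.unlink()          # delete the duplicate file
            removed.append(path.name)
        else:
            seen.add(digest)
    return removed
```

This only catches byte-identical copies; near-duplicates or genuinely mislabeled frames still need manual review or a perceptual-hash pass.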
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 347 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 64 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 126 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 104 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 61 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 21 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 60 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 16 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 705 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 88 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 87 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 21 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 121 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 130 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 15 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 53 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 55 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 101 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 9 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 22 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 105 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 64 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 9 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 43 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 283 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 55 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 193 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 9 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 7 | [Download](28/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 29 | 6 | [Download](29/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| noise | 173 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
Rewcifer/outputs_3models_2k | ---
dataset_info:
features:
- name: labels
dtype: string
- name: true_findings
dtype: string
- name: generated_texts_1
dtype: string
- name: row_number
dtype: int64
- name: generated_texts_2
dtype: string
- name: generated_texts_3
dtype: string
splits:
- name: train
num_bytes: 13378570
num_examples: 2000
download_size: 3697877
dataset_size: 13378570
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
srilaxmii-d04/sample_resumes | ---
dataset_info:
features:
- name: ID
dtype: int64
- name: Resume_str
dtype: string
- name: Resume_html
dtype: string
- name: Category
dtype: string
splits:
- name: train
num_bytes: 38342346.1489533
num_examples: 1738
- name: test
num_bytes: 16457646.851046698
num_examples: 746
download_size: 20336204
dataset_size: 54799993.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
rinabuoy/Eng-Khmer-Agg-Local-Reverse | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 2619029
num_examples: 5911
- name: train
num_bytes: 31194559
num_examples: 87400
download_size: 12394563
dataset_size: 33813588
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_kevin009__Llamafia | ---
pretty_name: Evaluation run of kevin009/Llamafia
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kevin009/Llamafia](https://huggingface.co/kevin009/Llamafia) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kevin009__Llamafia\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T11:18:55.714824](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__Llamafia/blob/main/results_2024-01-16T11-18-55.714824.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6211061177095942,\n\
\ \"acc_stderr\": 0.03253280765271219,\n \"acc_norm\": 0.6223051208563235,\n\
\ \"acc_norm_stderr\": 0.03319508878633904,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.47938272596576464,\n\
\ \"mc2_stderr\": 0.01507589659584474\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759082,\n\
\ \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.013830568927974332\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.617307309300936,\n\
\ \"acc_stderr\": 0.004850508945116088,\n \"acc_norm\": 0.8207528380800637,\n\
\ \"acc_norm_stderr\": 0.0038277525727700257\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.03550683989165581,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.03550683989165581\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.04598188057816542,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.04598188057816542\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601688,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601688\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721164,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721164\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110932,\n\
\ \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110932\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612903,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612903\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671632,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671632\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.030500283176545843,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.030500283176545843\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082396,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082396\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7867177522349936,\n\
\ \"acc_stderr\": 0.014648172749593517,\n \"acc_norm\": 0.7867177522349936,\n\
\ \"acc_norm_stderr\": 0.014648172749593517\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2201117318435754,\n\
\ \"acc_stderr\": 0.013856994024227175,\n \"acc_norm\": 0.2201117318435754,\n\
\ \"acc_norm_stderr\": 0.013856994024227175\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.02633661346904664,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.02633661346904664\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195448,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195448\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291477,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291477\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n\
\ \"acc_stderr\": 0.01273239828619044,\n \"acc_norm\": 0.46153846153846156,\n\
\ \"acc_norm_stderr\": 0.01273239828619044\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n\
\ \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6274509803921569,\n \"acc_stderr\": 0.019559646809215927,\n \
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.019559646809215927\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.03218093795602357,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.03218093795602357\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.47938272596576464,\n\
\ \"mc2_stderr\": 0.01507589659584474\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8011049723756906,\n \"acc_stderr\": 0.011218629972515316\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6087945413191812,\n \
\ \"acc_stderr\": 0.013442502402794302\n }\n}\n```"
repo_url: https://huggingface.co/kevin009/Llamafia
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|arc:challenge|25_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|gsm8k|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hellaswag|10_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T11-18-55.714824.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T11-18-55.714824.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- '**/details_harness|winogrande|5_2024-01-16T11-18-55.714824.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T11-18-55.714824.parquet'
- config_name: results
data_files:
- split: 2024_01_16T11_18_55.714824
path:
- results_2024-01-16T11-18-55.714824.parquet
- split: latest
path:
- results_2024-01-16T11-18-55.714824.parquet
---
# Dataset Card for Evaluation run of kevin009/Llamafia
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kevin009/Llamafia](https://huggingface.co/kevin009/Llamafia) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kevin009__Llamafia",
"harness_winogrande_5",
split="train")
```
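The aggregated metrics can also be recomputed by hand from the per-task entries. A minimal sketch, averaging accuracy over a few tasks using values copied from the latest-results JSON further down this card (the real run covers 63 tasks):

```python
# Average per-task accuracy from a results dict shaped like the
# "Latest results" JSON in this card. The three values below are
# copied verbatim from that JSON; task keys follow the harness naming.
results = {
    "harness|arc:challenge|25": {"acc": 0.6262798634812287},
    "harness|hellaswag|10": {"acc": 0.617307309300936},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.25},
}

# Collect each task's accuracy, then take the unweighted mean.
per_task_acc = {task: vals["acc"] for task, vals in results.items()}
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(round(mean_acc, 4))  # → 0.4979
```

The leaderboard's own "all" entry is computed over every task in the run, so it will differ from this three-task sample.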
## Latest results
These are the [latest results from run 2024-01-16T11:18:55.714824](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__Llamafia/blob/main/results_2024-01-16T11-18-55.714824.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6211061177095942,
"acc_stderr": 0.03253280765271219,
"acc_norm": 0.6223051208563235,
"acc_norm_stderr": 0.03319508878633904,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.47938272596576464,
"mc2_stderr": 0.01507589659584474
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759082,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.013830568927974332
},
"harness|hellaswag|10": {
"acc": 0.617307309300936,
"acc_stderr": 0.004850508945116088,
"acc_norm": 0.8207528380800637,
"acc_norm_stderr": 0.0038277525727700257
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.03550683989165581,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.03550683989165581
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.04598188057816542,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.04598188057816542
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601688,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601688
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721164,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721164
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110932,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612903,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612903
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671632,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671632
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545843,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545843
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098823,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098823
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7867177522349936,
"acc_stderr": 0.014648172749593517,
"acc_norm": 0.7867177522349936,
"acc_norm_stderr": 0.014648172749593517
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2201117318435754,
"acc_stderr": 0.013856994024227175,
"acc_norm": 0.2201117318435754,
"acc_norm_stderr": 0.013856994024227175
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.02633661346904664,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.02633661346904664
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195448,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195448
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291477,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291477
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.01273239828619044,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.01273239828619044
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.019559646809215927,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.019559646809215927
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.03218093795602357,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.03218093795602357
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.47938272596576464,
"mc2_stderr": 0.01507589659584474
},
"harness|winogrande|5": {
"acc": 0.8011049723756906,
"acc_stderr": 0.011218629972515316
},
"harness|gsm8k|5": {
"acc": 0.6087945413191812,
"acc_stderr": 0.013442502402794302
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
gbwsolutions/autotrain-data-TopicModeling | ---
dataset_info:
features:
- name: autotrain_text
dtype: string
- name: autotrain_label
dtype:
class_label:
names:
'0': Accuracy
'1': Cleanliness
'2': Quality
'3': Service
'4': Speed
'5': Staff
'6': Technology
splits:
- name: train
num_bytes: 4913
num_examples: 40
- name: validation
num_bytes: 1862
num_examples: 10
download_size: 11484
dataset_size: 6775
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "autotrain-data-TopicModeling"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
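The `class_label` feature above means each example's `autotrain_label` is stored as an integer index into the `names` list. A minimal stdlib-only sketch of that encoding, mirroring what `datasets.ClassLabel.int2str`/`str2int` do (the mapping is copied directly from the YAML metadata above):

```python
# Integer-to-name mapping for autotrain_label, copied from the dataset's YAML metadata.
LABEL_NAMES = [
    "Accuracy",     # 0
    "Cleanliness",  # 1
    "Quality",      # 2
    "Service",      # 3
    "Speed",        # 4
    "Staff",        # 5
    "Technology",   # 6
]

def int2str(label_id: int) -> str:
    """Decode an integer class label into its topic name."""
    return LABEL_NAMES[label_id]

def str2int(name: str) -> int:
    """Encode a topic name back into its integer class label."""
    return LABEL_NAMES.index(name)

print(int2str(3))        # Service
print(str2int("Speed"))  # 4
```

When the dataset is loaded with `datasets.load_dataset`, the same conversions are available via the feature object itself, e.g. `ds.features["autotrain_label"].int2str(3)`.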
open-llm-leaderboard/details_deepnight-research__lil-c3po | ---
pretty_name: Evaluation run of deepnight-research/lil-c3po
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [deepnight-research/lil-c3po](https://huggingface.co/deepnight-research/lil-c3po)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_deepnight-research__lil-c3po\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T17:28:57.885828](https://huggingface.co/datasets/open-llm-leaderboard/details_deepnight-research__lil-c3po/blob/main/results_2023-12-16T17-28-57.885828.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6248592823720264,\n\
\ \"acc_stderr\": 0.032934207150823985,\n \"acc_norm\": 0.627774280407218,\n\
\ \"acc_norm_stderr\": 0.03360219710155188,\n \"mc1\": 0.5238678090575275,\n\
\ \"mc1_stderr\": 0.017483547156961567,\n \"mc2\": 0.6873119394140667,\n\
\ \"mc2_stderr\": 0.0149863398321527\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759091,\n\
\ \"acc_norm\": 0.6501706484641638,\n \"acc_norm_stderr\": 0.01393680921215829\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6699860585540729,\n\
\ \"acc_stderr\": 0.004692567655961763,\n \"acc_norm\": 0.8444532961561442,\n\
\ \"acc_norm_stderr\": 0.0036168436913607627\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621502,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621502\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726367,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726367\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115979,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115979\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7290322580645161,\n\
\ \"acc_stderr\": 0.025284416114900156,\n \"acc_norm\": 0.7290322580645161,\n\
\ \"acc_norm_stderr\": 0.025284416114900156\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072388,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072388\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.024537591572830506,\n\
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.024537591572830506\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203624,\n \"\
acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203624\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035286,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035286\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.04453197507374984,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.04453197507374984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7943805874840357,\n\
\ \"acc_stderr\": 0.01445250045678583,\n \"acc_norm\": 0.7943805874840357,\n\
\ \"acc_norm_stderr\": 0.01445250045678583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879702,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879702\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48268156424581005,\n\
\ \"acc_stderr\": 0.016712467441702517,\n \"acc_norm\": 0.48268156424581005,\n\
\ \"acc_norm_stderr\": 0.016712467441702517\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.02641560191438899,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.02641560191438899\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.02673062072800491,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.02673062072800491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4406779661016949,\n\
\ \"acc_stderr\": 0.012680037994097074,\n \"acc_norm\": 0.4406779661016949,\n\
\ \"acc_norm_stderr\": 0.012680037994097074\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6225490196078431,\n \"acc_stderr\": 0.01961085147488029,\n \
\ \"acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.01961085147488029\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n\
\ \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n\
\ \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5238678090575275,\n\
\ \"mc1_stderr\": 0.017483547156961567,\n \"mc2\": 0.6873119394140667,\n\
\ \"mc2_stderr\": 0.0149863398321527\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987745\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4844579226686884,\n \
\ \"acc_stderr\": 0.013765829454512893\n }\n}\n```"
repo_url: https://huggingface.co/deepnight-research/lil-c3po
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|arc:challenge|25_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|gsm8k|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hellaswag|10_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T17-28-57.885828.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T17-28-57.885828.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- '**/details_harness|winogrande|5_2023-12-16T17-28-57.885828.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T17-28-57.885828.parquet'
- config_name: results
data_files:
- split: 2023_12_16T17_28_57.885828
path:
- results_2023-12-16T17-28-57.885828.parquet
- split: latest
path:
- results_2023-12-16T17-28-57.885828.parquet
---
# Dataset Card for Evaluation run of deepnight-research/lil-c3po
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [deepnight-research/lil-c3po](https://huggingface.co/deepnight-research/lil-c3po) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_deepnight-research__lil-c3po",
    "harness_winogrande_5",
    split="latest")
```
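The config names follow directly from the task identifiers used in the results JSON: the `|`, `:`, and `-` separators become underscores. A minimal sketch of a helper that derives the config name from a task identifier (the function name `task_to_config` is illustrative, not part of any library):

```python
import re

def task_to_config(task_name: str) -> str:
    """Map a harness task identifier (as it appears in the results JSON)
    to the corresponding dataset config name, based on the naming pattern
    visible in this card: '|', ':', and '-' each become '_'."""
    return re.sub(r"[|:\-]", "_", task_name)

print(task_to_config("harness|winogrande|5"))             # harness_winogrande_5
print(task_to_config("harness|truthfulqa:mc|0"))          # harness_truthfulqa_mc_0
print(task_to_config("harness|hendrycksTest-anatomy|5"))  # harness_hendrycksTest_anatomy_5
```

The resulting string can be passed as the second argument to `load_dataset` as in the example above.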
## Latest results
These are the [latest results from run 2023-12-16T17:28:57.885828](https://huggingface.co/datasets/open-llm-leaderboard/details_deepnight-research__lil-c3po/blob/main/results_2023-12-16T17-28-57.885828.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6248592823720264,
"acc_stderr": 0.032934207150823985,
"acc_norm": 0.627774280407218,
"acc_norm_stderr": 0.03360219710155188,
"mc1": 0.5238678090575275,
"mc1_stderr": 0.017483547156961567,
"mc2": 0.6873119394140667,
"mc2_stderr": 0.0149863398321527
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759091,
"acc_norm": 0.6501706484641638,
"acc_norm_stderr": 0.01393680921215829
},
"harness|hellaswag|10": {
"acc": 0.6699860585540729,
"acc_stderr": 0.004692567655961763,
"acc_norm": 0.8444532961561442,
"acc_norm_stderr": 0.0036168436913607627
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621502,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621502
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726367,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726367
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.02494236893115979,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.02494236893115979
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7290322580645161,
"acc_stderr": 0.025284416114900156,
"acc_norm": 0.7290322580645161,
"acc_norm_stderr": 0.025284416114900156
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072388,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072388
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.024537591572830506,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.024537591572830506
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.016785481159203624,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.016785481159203624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035286,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035286
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.04453197507374984,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.04453197507374984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7943805874840357,
"acc_stderr": 0.01445250045678583,
"acc_norm": 0.7943805874840357,
"acc_norm_stderr": 0.01445250045678583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879702,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879702
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48268156424581005,
"acc_stderr": 0.016712467441702517,
"acc_norm": 0.48268156424581005,
"acc_norm_stderr": 0.016712467441702517
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.02641560191438899,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.02641560191438899
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.02673062072800491,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.02673062072800491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.02604176620271716,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.02604176620271716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4406779661016949,
"acc_stderr": 0.012680037994097074,
"acc_norm": 0.4406779661016949,
"acc_norm_stderr": 0.012680037994097074
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.01961085147488029,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.01961085147488029
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7910447761194029,
"acc_stderr": 0.028748298931728655,
"acc_norm": 0.7910447761194029,
"acc_norm_stderr": 0.028748298931728655
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5238678090575275,
"mc1_stderr": 0.017483547156961567,
"mc2": 0.6873119394140667,
"mc2_stderr": 0.0149863398321527
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987745
},
"harness|gsm8k|5": {
"acc": 0.4844579226686884,
"acc_stderr": 0.013765829454512893
}
}
```
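As a rough illustration, the headline metrics visible in the JSON above can be aggregated in a few lines. Note this is not the leaderboard's official average, which also includes tasks not shown here (ARC, HellaSwag, the full MMLU suite):

```python
# Rough sketch: average a few headline metrics copied from the JSON above.
# The official leaderboard average covers more tasks; this is illustrative only.
scores = {
    "truthfulqa_mc2": 0.6873119394140667,
    "winogrande_acc": 0.7916337805840569,
    "gsm8k_acc": 0.4844579226686884,
}
average = sum(scores.values()) / len(scores)
print(round(average, 4))  # → 0.6545
```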
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
one-sec-cv12/chunk_166 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 21963392208.125
num_examples: 228671
download_size: 20654079336
dataset_size: 21963392208.125
---
# Dataset Card for "chunk_166"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
astrin0321/SensorServerv1.0 | ---
license: apache-2.0
---
|
Mohamad-Jaallouk/color_dataset | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: colorized_image
dtype: image
splits:
- name: train
num_bytes: 35868425.0
num_examples: 766
download_size: 35923024
dataset_size: 35868425.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Niraj-ML/chatbot | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 436905
num_examples: 2415
download_size: 102812
dataset_size: 436905
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Kirili4ik/yandex_jobs | ---
annotations_creators:
- expert-generated
language:
- ru
language_creators:
- found
license:
- unknown
multilinguality:
- monolingual
pretty_name: yandex_jobs
size_categories:
- n<1K
source_datasets:
- original
tags:
- vacancies
- jobs
- ru
- yandex
task_categories:
- text-generation
- summarization
- multiple-choice
task_ids:
- language-modeling
---
# Dataset Card for Yandex_Jobs
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Contributions](#contributions)
## Dataset Description
### Dataset Summary
This is a dataset of more than 600 IT vacancies in Russian, parsed from the Telegram channel https://t.me/ya_jobs. All the texts are fully structured, with no missing values.
### Supported Tasks and Leaderboards
`text-generation` using the 'Raw text' column.
`summarization`: generating the header from the full vacancy text.
`multiple-choice`: predicting the hashtags (choosing several from all hashtags available in the dataset).
### Languages
The text in the dataset is only in Russian. The associated BCP-47 code is `ru`.
## Dataset Structure
### Data Instances
The data is parsed from vacancies of the Russian IT company [Yandex](https://ya.ru/).
An example from the set looks as follows:
```
{'Header': 'Разработчик интерфейсов в группу разработки спецпроектов',
'Emoji': '🎳',
'Description': 'Конструктор лендингов — это инструмент Яндекса, который позволяет пользователям создавать лендинги и турбо-лендинги для Яндекс.Директа. Турбо — режим ускоренной загрузки страниц для показа на мобильных. У нас современный стек, смелые планы и высокая динамика.\nМы ищем опытного и открытого новому фронтенд-разработчика.',
'Requirements': '• отлично знаете JavaScript
• разрабатывали на Node.js, применяли фреймворк Express
• умеете создавать веб-приложения на React + Redux
• знаете HTML и CSS, особенности их отображения в браузерах',
'Tasks': '• разрабатывать интерфейсы',
'Pluses': '• писали интеграционные, модульные, функциональные или браузерные тесты
• умеете разворачивать и администрировать веб-сервисы: собирать Docker-образы, настраивать мониторинги, выкладывать в облачные системы, отлаживать в продакшене
• работали с реляционными БД PostgreSQL',
'Hashtags': '#фронтенд #турбо #JS',
'Link': 'https://ya.cc/t/t7E3UsmVSKs6L',
'Raw text': 'Разработчик интерфейсов в группу разработки спецпроектов🎳
Конструктор лендингов — это инструмент Яндекса, который позволяет пользователям создавать лендинги и турбо-лендинги для Яндекс.Директа. Турбо — режим ускоренной загрузки страниц для показа на мобильных. У нас современный стек, смелые планы и высокая динамика.
Мы ищем опытного и открытого новому фронтенд-разработчика.
Мы ждем, что вы:
• отлично знаете JavaScript
• разрабатывали на Node.js, применяли фреймворк Express
• умеете создавать веб-приложения на React + Redux
• знаете HTML и CSS, особенности их отображения в браузерах
Что нужно делать:
• разрабатывать интерфейсы
Будет плюсом, если вы:
• писали интеграционные, модульные, функциональные или браузерные тесты
• умеете разворачивать и администрировать веб-сервисы: собирать Docker-образы, настраивать мониторинги, выкладывать в облачные системы, отлаживать в продакшене
• работали с реляционными БД PostgreSQL
https://ya.cc/t/t7E3UsmVSKs6L
#фронтенд #турбо #JS'
}
```
### Data Fields
- `Header`: The job position title (str)
- `Emoji`: The emoji used at the end of the position title, usually associated with the position (str)
- `Description`: A short description of the vacancy (str)
- `Requirements`: A few required technologies, programming languages, or experience items (str)
- `Tasks`: Examples of the tasks of the job position (str)
- `Pluses`: A few nice-to-have points for the applicant (technologies/experience/etc.) (str)
- `Hashtags`: A list of hashtags associated with the job, usually programming languages (str)
- `Link`: A link to the job description (there may be more information there, but it is not checked) (str)
- `Raw text`: The raw text with all the formatting from the channel, assembled from the other fields (str)
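Since `Raw text` is assembled from the other fields with fixed Russian section headers, those fields can be recovered from it with a small parser. A minimal sketch, assuming the three section markers shown in the example above (the function name and sample text are illustrative):

```python
# Hypothetical sketch: split a vacancy's "Raw text" back into sections
# using the fixed Russian headers seen in the channel posts.
MARKERS = {
    "Requirements": "Мы ждем, что вы:",
    "Tasks": "Что нужно делать:",
    "Pluses": "Будет плюсом, если вы:",
}

def split_sections(raw: str) -> dict:
    """Return the requirements/tasks/pluses sections found in a raw post."""
    # Locate each marker and sort by its position in the text.
    hits = sorted(
        (raw.find(m), name, m) for name, m in MARKERS.items() if m in raw
    )
    result = {}
    for i, (pos, name, marker) in enumerate(hits):
        start = pos + len(marker)
        # Each section runs until the next marker (or the end of the text).
        end = hits[i + 1][0] if i + 1 < len(hits) else len(raw)
        result[name] = raw[start:end].strip()
    return result

raw = (
    "Заголовок🎳\nОписание.\n"
    "Мы ждем, что вы:\n• знаете JavaScript\n"
    "Что нужно делать:\n• разрабатывать интерфейсы\n"
    "Будет плюсом, если вы:\n• писали тесты"
)
sections = split_sections(raw)
print(sections["Tasks"])  # → • разрабатывать интерфейсы
```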
### Data Splits
There are not yet enough examples to split into train/test/validation sets.
## Dataset Creation
The data was downloaded and parsed from the Telegram channel https://t.me/ya_jobs on 03.09.2022. All unparsed examples and those missing any field were deleted (from 1600 vacancies down to 600 with no missing fields such as emojis or links).
## Considerations for Using the Data
These vacancies come from a single IT company (Yandex). They can therefore be quite specific and probably cannot be generalized to vacancies in general, or even to IT vacancies in general.
## Contributions
- **Point of Contact and Author:** [Kirill Gelvan](telegram: @kirili4ik) |
Nandini82/science-qa | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3200811
num_examples: 2922
download_size: 811357
dataset_size: 3200811
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/baba_konomi_theidolmstermillionlive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of baba_konomi/馬場このみ/바바코노미 (THE iDOLM@STER: Million Live!)
This is the dataset of baba_konomi/馬場このみ/바바코노미 (THE iDOLM@STER: Million Live!), containing 294 images and their tags.
The core tags of this character are `brown_hair, braid, long_hair, single_braid, hair_over_shoulder, aqua_eyes, breasts, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 294 | 315.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/baba_konomi_theidolmstermillionlive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 294 | 202.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/baba_konomi_theidolmstermillionlive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 694 | 421.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/baba_konomi_theidolmstermillionlive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 294 | 292.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/baba_konomi_theidolmstermillionlive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 694 | 561.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/baba_konomi_theidolmstermillionlive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
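The `800` and `1200` packages cap the image's shorter side. A minimal sketch of that rule, assuming downscale-only, aspect-ratio-preserving resizing (the function is illustrative, not the actual packaging code):

```python
# Hypothetical sketch of the "shorter side not exceeding N pixels" rule:
# downscale only, keeping the aspect ratio unchanged.
def capped_size(width: int, height: int, max_short_side: int) -> tuple:
    short = min(width, height)
    if short <= max_short_side:
        return width, height  # already small enough, keep as-is
    scale = max_short_side / short
    return round(width * scale), round(height * scale)

print(capped_size(1600, 2400, 800))  # → (800, 1200)
print(capped_size(640, 480, 800))    # → (640, 480)
```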
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/baba_konomi_theidolmstermillionlive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, dress, blush, hair_ornament, one_eye_closed, open_mouth |
| 1 | 12 |  |  |  |  |  | 1girl, blush, small_breasts, solo, looking_at_viewer, micro_bikini, navel, white_bikini, smile, open_mouth |
| 2 | 9 |  |  |  |  |  | 1girl, blush, nipples, female_pubic_hair, navel, nude, spread_legs, sweat, censored, looking_at_viewer, small_breasts, 1boy, cum_in_pussy, hetero, open_mouth, penis, solo_focus, cum_on_body, lying |
| 3 | 5 |  |  |  |  |  | 1girl, blush, pleated_skirt, randoseru, short_sleeves, white_shirt, looking_at_viewer, open_mouth, serafuku, solo, suspender_skirt, black_skirt, blue_skirt, hair_between_eyes, white_background, white_sailor_collar, black_footwear, bow, collared_shirt, green_eyes, kneehighs, red_bag, shoes, simple_background, striped, sweat, table, twin_braids, white_socks |
| 4 | 6 |  |  |  |  |  | 1girl, blush, looking_at_viewer, small_breasts, solo, navel, pillow, underwear_only, black_bra, black_panties, green_eyes, lying, open_mouth |
| 5 | 9 |  |  |  |  |  | playboy_bunny, rabbit_ears, 1girl, detached_collar, wrist_cuffs, blush, bowtie, fake_animal_ears, looking_at_viewer, solo, smile, bare_shoulders, rabbit_tail, ass, fishnet_pantyhose, strapless_leotard |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | solo | dress | blush | hair_ornament | one_eye_closed | open_mouth | small_breasts | micro_bikini | navel | white_bikini | nipples | female_pubic_hair | nude | spread_legs | sweat | censored | 1boy | cum_in_pussy | hetero | penis | solo_focus | cum_on_body | lying | pleated_skirt | randoseru | short_sleeves | white_shirt | serafuku | suspender_skirt | black_skirt | blue_skirt | hair_between_eyes | white_background | white_sailor_collar | black_footwear | bow | collared_shirt | green_eyes | kneehighs | red_bag | shoes | simple_background | striped | table | twin_braids | white_socks | pillow | underwear_only | black_bra | black_panties | playboy_bunny | rabbit_ears | detached_collar | wrist_cuffs | bowtie | fake_animal_ears | bare_shoulders | rabbit_tail | ass | fishnet_pantyhose | strapless_leotard |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:--------|:--------|:----------------|:-----------------|:-------------|:----------------|:---------------|:--------|:---------------|:----------|:--------------------|:-------|:--------------|:--------|:-----------|:-------|:---------------|:---------|:--------|:-------------|:--------------|:--------|:----------------|:------------|:----------------|:--------------|:-----------|:------------------|:--------------|:-------------|:--------------------|:-------------------|:----------------------|:-----------------|:------|:-----------------|:-------------|:------------|:----------|:--------|:--------------------|:----------|:--------|:--------------|:--------------|:---------|:-----------------|:------------|:----------------|:----------------|:--------------|:------------------|:--------------|:---------|:-------------------|:-----------------|:--------------|:------|:--------------------|:--------------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | X | X | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | | | | X | | | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | X | | X | | | X | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | | X | | X | | | X | X | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
tyzhu/find_second_sent_train_100_eval_20 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 281556
num_examples: 220
- name: validation
num_bytes: 20165
num_examples: 20
download_size: 156440
dataset_size: 301721
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_second_sent_train_100_eval_20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distantquant/openbohm | ---
license: cc-by-4.0
language:
- en
tags:
- multi-turn
- philosophy
- long-form
pretty_name: openBohm
size_categories:
- n<1K
---
## OpenBohm
This dataset is an experimental combination of philosophical multi-turn, long-form conversations from J. Krishnamurti and D. Bohm, merged with long-conversation-filtered (count > 6) Capybara data, edited to be slightly less apologetic.
References to names and locations were removed where possible. Some conversations have been paraphrased somewhat to better follow a QA format; however, they keep the key content of the original.
 |
nateraw/sync_food101 | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- unknown
multilinguality:
- monolingual
pretty_name: food101
size_categories:
- 10K<n<100K
source_datasets:
- extended|other-foodspotting
task_categories:
- other
task_ids:
- other-other-image-classification
paperswithcode_id: food-101
---
# Dataset Card for Food-101
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**[Food-101 Dataset](https://data.vision.ee.ethz.ch/cvl/datasets_extra/food-101/)
- **Repository:** N/A
- **Paper:**[Paper](https://data.vision.ee.ethz.ch/cvl/datasets_extra/food-101/static/bossard_eccv14_food-101.pdf)
- **Leaderboard:** N/A
- **Point of Contact:** N/A
### Dataset Summary
This dataset consists of 101 food categories, with 101,000 images in total. For each class, 250 manually reviewed test images are provided, as well as 750 training images. On purpose, the training images were not cleaned, and thus still contain some amount of noise. This comes mostly in the form of intense colors and sometimes wrong labels. All images were rescaled to have a maximum side length of 512 pixels.
### Supported Tasks and Leaderboards
- image-classification
### Languages
English
## Dataset Structure
### Data Instances
A sample from the training set is provided below:
```
{
'image': '/root/.cache/huggingface/datasets/downloads/extracted/6e1e8c9052e9f3f7ecbcb4b90860668f81c1d36d86cc9606d49066f8da8bfb4f/food-101/images/churros/1004234.jpg',
'label': 23
}
```
### Data Fields
The data instances have the following fields:
- `image`: a `string` filepath to an image.
- `label`: an `int` classification label.
### Data Splits
| name |train|validation|
|----------|----:|---------:|
|food101|75750|25250|
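The split sizes in the table follow directly from the per-class counts given in the summary (750 training and 250 test images for each of the 101 classes); a quick arithmetic check:

```python
# Sanity check of the split sizes quoted above:
# 101 classes, 750 train and 250 test images per class.
num_classes = 101
train_per_class, test_per_class = 750, 250

train_total = num_classes * train_per_class
test_total = num_classes * test_per_class
print(train_total, test_total, train_total + test_total)  # → 75750 25250 101000
```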
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@inproceedings{bossard14,
title = {Food-101 -- Mining Discriminative Components with Random Forests},
author = {Bossard, Lukas and Guillaumin, Matthieu and Van Gool, Luc},
booktitle = {European Conference on Computer Vision},
year = {2014}
}
```
### Contributions
Thanks to [@nateraw](https://github.com/nateraw) for adding this dataset.
|
kpriyanshu256/semeval-multi-test | ---
dataset_info:
features:
- name: text
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 99468808
num_examples: 42378
download_size: 58558248
dataset_size: 99468808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/random_letter_same_length_find_passage_train100_eval10_title | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 66032
num_examples: 210
- name: validation
num_bytes: 3378
num_examples: 10
download_size: 34114
dataset_size: 69410
---
# Dataset Card for "random_letter_same_length_find_passage_train100_eval10_title"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/sarina_shizukume_mahoushoujosite | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Sarina Shizukume/雫芽さりな (Mahou Shoujo Site)
This is the dataset of Sarina Shizukume/雫芽さりな (Mahou Shoujo Site), containing 231 images and their tags.
The core tags of this character are `red_hair, twintails, red_eyes, low_twintails, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 231 | 111.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sarina_shizukume_mahoushoujosite/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 231 | 110.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sarina_shizukume_mahoushoujosite/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 360 | 170.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sarina_shizukume_mahoushoujosite/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sarina_shizukume_mahoushoujosite',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, cloak, solo, looking_at_viewer, closed_mouth, hood_up, cloud, sky, upper_body, dark, standing |
| 1 | 10 |  |  |  |  |  | 1girl, solo, profile, closed_mouth, portrait, anime_coloring, scar, from_side, smile |
| 2 | 14 |  |  |  |  |  | 1girl, serafuku, solo, open_mouth, anime_coloring, multicolored_hair |
| 3 | 7 |  |  |  |  |  | black_shirt, locker, serafuku, smile, upper_body, white_sailor_collar, 1girl, short_sleeves, open_mouth, solo, indoors |
| 4 | 14 |  |  |  |  |  | 1girl, solo, serafuku, clenched_teeth, dark, upper_body, short_sleeves, brown_hair, angry, anime_coloring, looking_at_viewer |
| 5 | 8 |  |  |  |  |  | 1girl, hair_over_shoulder, open_mouth, solo, anime_coloring, fang, hair_scrunchie, bandaged_neck, brown_hair, brown_eyes, dark, sweatdrop, upper_body |
| 6 | 7 |  |  |  |  |  | 1girl, green_shirt, solo, bandages, collarbone, multicolored_hair, fang, bra_strap, orange_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cloak | solo | looking_at_viewer | closed_mouth | hood_up | cloud | sky | upper_body | dark | standing | profile | portrait | anime_coloring | scar | from_side | smile | serafuku | open_mouth | multicolored_hair | black_shirt | locker | white_sailor_collar | short_sleeves | indoors | clenched_teeth | brown_hair | angry | hair_over_shoulder | fang | hair_scrunchie | bandaged_neck | brown_eyes | sweatdrop | green_shirt | bandages | collarbone | bra_strap | orange_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------------------|:---------------|:----------|:--------|:------|:-------------|:-------|:-----------|:----------|:-----------|:-----------------|:-------|:------------|:--------|:-----------|:-------------|:--------------------|:--------------|:---------|:----------------------|:----------------|:----------|:-----------------|:-------------|:--------|:---------------------|:-------|:-----------------|:----------------|:-------------|:------------|:--------------|:-----------|:-------------|:------------|:--------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | | X | | X | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | | X | | | | | | | | | | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | X | | | | | | X | | | | | | | | X | X | X | | X | X | X | X | X | | | | | | | | | | | | | | |
| 4 | 14 |  |  |  |  |  | X | | X | X | | | | | X | X | | | | X | | | | X | | | | | | X | | X | X | X | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | X | | | | | | X | X | | | | X | | | | | X | | | | | | | | X | | X | X | X | X | X | X | | | | | |
| 6 | 7 |  |  |  |  |  | X | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | X | X | X | X | X |
|
open-llm-leaderboard/details_psmathur__test_42_70b | ---
pretty_name: Evaluation run of psmathur/test_42_70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [psmathur/test_42_70b](https://huggingface.co/psmathur/test_42_70b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 3 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__test_42_70b_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2023-11-07T08:14:38.218715](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__test_42_70b_public/blob/main/results_2023-11-07T08-14-38.218715.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08095637583892618,\n\
\ \"em_stderr\": 0.0027934007378494835,\n \"f1\": 0.14089450503355697,\n\
\ \"f1_stderr\": 0.002922494704077647,\n \"acc\": 0.6480304552550813,\n\
\ \"acc_stderr\": 0.012058894490351774\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.08095637583892618,\n \"em_stderr\": 0.0027934007378494835,\n\
\ \"f1\": 0.14089450503355697,\n \"f1_stderr\": 0.002922494704077647\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45943896891584535,\n \
\ \"acc_stderr\": 0.013727093010429786\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.01039069597027376\n\
\ }\n}\n```"
repo_url: https://huggingface.co/psmathur/test_42_70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_05T10_37_53.854467
path:
- '**/details_harness|drop|3_2023-11-05T10-37-53.854467.parquet'
- split: 2023_11_07T08_14_38.218715
path:
- '**/details_harness|drop|3_2023-11-07T08-14-38.218715.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-07T08-14-38.218715.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_05T10_37_53.854467
path:
- '**/details_harness|gsm8k|5_2023-11-05T10-37-53.854467.parquet'
- split: 2023_11_07T08_14_38.218715
path:
- '**/details_harness|gsm8k|5_2023-11-07T08-14-38.218715.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-07T08-14-38.218715.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_05T10_37_53.854467
path:
- '**/details_harness|winogrande|5_2023-11-05T10-37-53.854467.parquet'
- split: 2023_11_07T08_14_38.218715
path:
- '**/details_harness|winogrande|5_2023-11-07T08-14-38.218715.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-07T08-14-38.218715.parquet'
- config_name: results
data_files:
- split: 2023_11_05T10_37_53.854467
path:
- results_2023-11-05T10-37-53.854467.parquet
- split: 2023_11_07T08_14_38.218715
path:
- results_2023-11-07T08-14-38.218715.parquet
- split: latest
path:
- results_2023-11-07T08-14-38.218715.parquet
---
# Dataset Card for Evaluation run of psmathur/test_42_70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/test_42_70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/test_42_70b](https://huggingface.co/psmathur/test_42_70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__test_42_70b_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-07T08:14:38.218715](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__test_42_70b_public/blob/main/results_2023-11-07T08-14-38.218715.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08095637583892618,
"em_stderr": 0.0027934007378494835,
"f1": 0.14089450503355697,
"f1_stderr": 0.002922494704077647,
"acc": 0.6480304552550813,
"acc_stderr": 0.012058894490351774
},
"harness|drop|3": {
"em": 0.08095637583892618,
"em_stderr": 0.0027934007378494835,
"f1": 0.14089450503355697,
"f1_stderr": 0.002922494704077647
},
"harness|gsm8k|5": {
"acc": 0.45943896891584535,
"acc_stderr": 0.013727093010429786
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.01039069597027376
}
}
```
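As a quick sanity check on the figures above, the `"acc"` reported under `"all"` appears to be the plain unweighted mean of the two per-task accuracies — an observation from this run's numbers, not documented aggregation behavior:

```python
# Per-task accuracies copied from the latest-results JSON above.
results = {
    "harness|gsm8k|5": {"acc": 0.45943896891584535},
    "harness|winogrande|5": {"acc": 0.8366219415943172},
}

# The aggregate "acc" under "all" matches the unweighted mean of the tasks.
accs = [task["acc"] for task in results.values()]
overall_acc = sum(accs) / len(accs)
print(overall_acc)  # ≈ 0.6480304552550813, the "all" value above
```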
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_137 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1161918020.0
num_examples: 228185
download_size: 1186267476
dataset_size: 1161918020.0
---
# Dataset Card for "chunk_137"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amaye15/receipts | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype:
class_label:
names:
'0': Barcode
'1': Invoice
'2': Object
'3': Receipt
'4': Non-Object
splits:
- name: train
num_bytes: 11235048033.047714
num_examples: 10200
- name: test
num_bytes: 3584195339.875286
num_examples: 2551
download_size: 15096285024
dataset_size: 14819243372.923
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
yzhuang/autotree_automl_default-of-credit-card-clients_sgosdt_l256_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 308080000
num_examples: 10000
- name: validation
num_bytes: 308080000
num_examples: 10000
download_size: 181794530
dataset_size: 616160000
---
# Dataset Card for "autotree_automl_default-of-credit-card-clients_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tankado/Dutch_Census | ---
size_categories:
- 10K<n<100K
--- |
thanujaifin/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FinchResearch/guanaco-extended | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
tags:
- language
- comprehensive
- synthetic
- llm
- nlp
pretty_name: guanaco-extended
size_categories:
- 1M<n<10M
---
# Hugging Face Dataset Card: Amoeba Mixed AI-Human Generated Samples
## Overview
Amoeba Mixed AI-Human Generated Samples is a massive dataset that contains a diverse collection of text samples generated by both AI models and human authors. With a size exceeding 13 GB, this dataset is designed to foster research and development in the field of natural language generation and understanding.

## Dataset Description
- Name: Amoeba Mixed AI-Human Generated Samples
- Size: Over 13 GB
- Split: Single split
- License: Open-source (Creative Commons License)
- Dataset Homepage: https://example.com/amoeba-dataset
## Intended Use
The Amoeba Mixed AI-Human Generated Samples dataset is intended for various natural language processing (NLP) tasks, including but not limited to:
- Text generation
- Language modeling
- Text classification
- Sentiment analysis
- Language translation
- Text summarization
## Data Sources
The dataset comprises a blend of AI-generated samples from the state-of-the-art language model, "Amoeba," and manually curated human-generated samples from diverse sources. By combining AI and human contributions, the dataset ensures a rich and varied distribution of language patterns and styles.
## Data Format
The data is provided in plain text format, with one sample per line. Each sample represents a unique text snippet that can range from a few words to full sentences.
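Since the card only specifies "one sample per line", a minimal loader just splits on newlines and skips blanks. This is a hedged sketch — the file name and contents below are illustrative stand-ins, not part of the dataset:

```python
from pathlib import Path

# Illustrative stand-in file; the real dataset file name is not given on this card.
path = Path("amoeba_samples.txt")
path.write_text("first sample\n\nsecond sample\n", encoding="utf-8")

# One sample per line; blank lines are skipped.
samples = [
    line.strip()
    for line in path.read_text(encoding="utf-8").splitlines()
    if line.strip()
]
print(samples)  # ['first sample', 'second sample']
```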
Example:
|
varaslaw/KaggleRVC2 | ---
license: openrail
---
|
EgilKarlsen/PKDD_BERT_FT | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115608885
num_examples: 37500
- name: test
num_bytes: 38536331
num_examples: 12500
download_size: 211882193
dataset_size: 154145216
---
# Dataset Card for "PKDD_BERT_FT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/msmarco-passage_train_triples-v2 | ---
pretty_name: '`msmarco-passage/train/triples-v2`'
viewer: false
source_datasets: ['irds/msmarco-passage']
task_categories:
- text-retrieval
---
# Dataset Card for `msmarco-passage/train/triples-v2`
The `msmarco-passage/train/triples-v2` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/msmarco-passage#msmarco-passage/train/triples-v2).
# Data
This dataset provides:
- `docpairs`; count=397,768,673
- For `docs`, use [`irds/msmarco-passage`](https://huggingface.co/datasets/irds/msmarco-passage)
## Usage
```python
from datasets import load_dataset
docpairs = load_dataset('irds/msmarco-passage_train_triples-v2', 'docpairs')
for record in docpairs:
record # {'query_id': ..., 'doc_id_a': ..., 'doc_id_b': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Bajaj2016Msmarco,
title={MS MARCO: A Human Generated MAchine Reading COmprehension Dataset},
  author={Payal Bajaj and Daniel Campos and Nick Craswell and Li Deng and Jianfeng Gao and Xiaodong Liu and Rangan Majumder and Andrew McNamara and Bhaskar Mitra and Tri Nguyen and Mir Rosenberg and Xia Song and Alina Stoica and Saurabh Tiwary and Tong Wang},
booktitle={InCoCo@NIPS},
year={2016}
}
```
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_36 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 488484288.0
num_examples: 95184
download_size: 500378923
dataset_size: 488484288.0
---
# Dataset Card for "chunk_36"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_wandb__gemma-7b-zephyr-sft | ---
pretty_name: Evaluation run of wandb/gemma-7b-zephyr-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wandb/gemma-7b-zephyr-sft](https://huggingface.co/wandb/gemma-7b-zephyr-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one\
  \ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wandb__gemma-7b-zephyr-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T00:37:56.432302](https://huggingface.co/datasets/open-llm-leaderboard/details_wandb__gemma-7b-zephyr-sft/blob/main/results_2024-03-01T00-37-56.432302.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6033367728065783,\n\
\ \"acc_stderr\": 0.032882594243367605,\n \"acc_norm\": 0.606958636660939,\n\
\ \"acc_norm_stderr\": 0.03353521541447871,\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.01615020132132301,\n \"mc2\": 0.4334943445434731,\n\
\ \"mc2_stderr\": 0.014653429969235831\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.568259385665529,\n \"acc_stderr\": 0.014474591427196202,\n\
\ \"acc_norm\": 0.6143344709897611,\n \"acc_norm_stderr\": 0.014224250973257182\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6063533160724955,\n\
\ \"acc_stderr\": 0.004875595792850676,\n \"acc_norm\": 0.8073093009360686,\n\
\ \"acc_norm_stderr\": 0.003936061455151114\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n \
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082637,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082637\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4603174603174603,\n \"acc_stderr\": 0.025670080636909186,\n \"\
acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.025670080636909186\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7354838709677419,\n \"acc_stderr\": 0.025091892378859275,\n \"\
acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.025091892378859275\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4433497536945813,\n \"acc_stderr\": 0.034953345821629324,\n \"\
acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.034953345821629324\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\"\
: 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710862,\n\
\ \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710862\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.03210479051015776,\n \
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.03210479051015776\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7853211009174312,\n \"acc_stderr\": 0.017604304149256476,\n \"\
acc_norm\": 0.7853211009174312,\n \"acc_norm_stderr\": 0.017604304149256476\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604257,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604257\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156214,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156214\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281355,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281355\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.02632981334194624,\n\
\ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.02632981334194624\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.01442229220480884,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.01442229220480884\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281416,\n\
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281416\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.029427994039419994,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.029427994039419994\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\
\ \"acc_stderr\": 0.012604960816087371,\n \"acc_norm\": 0.4198174706649283,\n\
\ \"acc_norm_stderr\": 0.012604960816087371\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213514,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213514\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882537,\n \
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882537\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.02992310056368391,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.02992310056368391\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.01615020132132301,\n \"mc2\": 0.4334943445434731,\n\
\ \"mc2_stderr\": 0.014653429969235831\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972384\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4981046247156937,\n \
\ \"acc_stderr\": 0.013772385765569753\n }\n}\n```"
repo_url: https://huggingface.co/wandb/gemma-7b-zephyr-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|arc:challenge|25_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|gsm8k|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hellaswag|10_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-37-56.432302.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T00-37-56.432302.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- '**/details_harness|winogrande|5_2024-03-01T00-37-56.432302.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T00-37-56.432302.parquet'
- config_name: results
data_files:
- split: 2024_03_01T00_37_56.432302
path:
- results_2024-03-01T00-37-56.432302.parquet
- split: latest
path:
- results_2024-03-01T00-37-56.432302.parquet
---
# Dataset Card for Evaluation run of wandb/gemma-7b-zephyr-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [wandb/gemma-7b-zephyr-sft](https://huggingface.co/wandb/gemma-7b-zephyr-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wandb__gemma-7b-zephyr-sft",
"harness_winogrande_5",
	split="latest")
```
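The config names in the YAML above are derived mechanically from the harness task names that appear in the results. As an illustration (this helper is hypothetical, not part of the `datasets` API), the mapping can be sketched as a simple character substitution:

```python
def task_to_config(task: str) -> str:
    """Map a harness task name such as 'harness|hendrycksTest-anatomy|5'
    to the corresponding dataset config name.

    The pattern observed in the config list: '|', '-', and ':' all become '_'.
    """
    return task.replace("|", "_").replace("-", "_").replace(":", "_")


# Examples drawn from the config list above:
print(task_to_config("harness|hendrycksTest-anatomy|5"))  # harness_hendrycksTest_anatomy_5
print(task_to_config("harness|truthfulqa:mc|0"))          # harness_truthfulqa_mc_0
```

This is only a convenience for constructing the second argument to `load_dataset`; the authoritative list of configs is the YAML header of this card.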
## Latest results
These are the [latest results from run 2024-03-01T00:37:56.432302](https://huggingface.co/datasets/open-llm-leaderboard/details_wandb__gemma-7b-zephyr-sft/blob/main/results_2024-03-01T00-37-56.432302.json) (note that there may be results for other tasks in the repo if successive evals didn't cover the same tasks; each task appears in the "results" config and in the "latest" split of its own config):
```json
{
"all": {
"acc": 0.6033367728065783,
"acc_stderr": 0.032882594243367605,
"acc_norm": 0.606958636660939,
"acc_norm_stderr": 0.03353521541447871,
"mc1": 0.3072215422276622,
"mc1_stderr": 0.01615020132132301,
"mc2": 0.4334943445434731,
"mc2_stderr": 0.014653429969235831
},
"harness|arc:challenge|25": {
"acc": 0.568259385665529,
"acc_stderr": 0.014474591427196202,
"acc_norm": 0.6143344709897611,
"acc_norm_stderr": 0.014224250973257182
},
"harness|hellaswag|10": {
"acc": 0.6063533160724955,
"acc_stderr": 0.004875595792850676,
"acc_norm": 0.8073093009360686,
"acc_norm_stderr": 0.003936061455151114
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082637,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082637
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.025670080636909186,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.025670080636909186
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.025091892378859275,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.025091892378859275
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.034953345821629324,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.034953345821629324
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.025028610276710862,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.025028610276710862
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.03210479051015776,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.03210479051015776
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7853211009174312,
"acc_stderr": 0.017604304149256476,
"acc_norm": 0.7853211009174312,
"acc_norm_stderr": 0.017604304149256476
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604257,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156214,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156214
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281355,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281355
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6040462427745664,
"acc_stderr": 0.02632981334194624,
"acc_norm": 0.6040462427745664,
"acc_norm_stderr": 0.02632981334194624
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.01442229220480884,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.01442229220480884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281416,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281416
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.029427994039419994,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.029427994039419994
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.012604960816087371,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.012604960816087371
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213514,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213514
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882537,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882537
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.02992310056368391,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.02992310056368391
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3072215422276622,
"mc1_stderr": 0.01615020132132301,
"mc2": 0.4334943445434731,
"mc2_stderr": 0.014653429969235831
},
"harness|winogrande|5": {
"acc": 0.7419100236779794,
"acc_stderr": 0.012298278833972384
},
"harness|gsm8k|5": {
"acc": 0.4981046247156937,
"acc_stderr": 0.013772385765569753
}
}
```
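The per-run results file is plain JSON, so it can also be inspected without the `datasets` library. A minimal sketch (using three values copied verbatim from the results above; in practice you would read the full `results_*.json` file from this repository) of recomputing a mean accuracy over a subset of tasks:

```python
import json

# Three per-task entries copied from the results above.
snippet = """{
  "harness|hendrycksTest-anatomy|5": {"acc": 0.5259259259259259},
  "harness|hendrycksTest-astronomy|5": {"acc": 0.6710526315789473},
  "harness|hendrycksTest-virology|5": {"acc": 0.4759036144578313}
}"""

results = json.loads(snippet)
accs = [v["acc"] for v in results.values()]
print(f"mean acc over {len(accs)} tasks: {sum(accs) / len(accs):.4f}")
# → mean acc over 3 tasks: 0.5576
```

Note that the "all" block in the actual results averages over every evaluated task, not just the three shown here, so its value will differ from this toy mean.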
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
cakiki/sql_paths | ---
dataset_info:
features:
- name: repository_name
dtype: string
splits:
- name: train
num_bytes: 35050567
num_examples: 1267490
download_size: 23626806
dataset_size: 35050567
---
# Dataset Card for "sql_paths"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Logic123456789/Luotuo-QA-B | ---
extra_gated_prompt: 我们制作了luotuo-QA-B数据集,请仔细阅读Licensing Information部分的信息。
extra_gated_heading: "您需要接受协议并提交信息以获取此数据集"
extra_gated_fields:
姓名: text
邮箱: text
所在组织: text
使用目的: text
我同意仅将此数据集用于非商业用途: checkbox
extra_gated_button_content: "我已阅读协议并同意提供相关信息"
license: other
task_categories:
- question-answering
language:
- zh
- en
---
# Dataset Card for luotuo-QA-B
## Dataset Description
- **Homepage:** https://github.com/LC1332/Luotuo-Chinese-LLM
- **Repository:** https://github.com/LC1332/Luotuo-QA
- **Point of Contact:** qinyu_luo@163.com
### Dataset Summary
Anki_Card是一种用于记忆和学习的电子卡片系统。我们建立了一个类似于这种形式的问答数据集,旨在推动中英文语境下问答模型的研究和发展。
我们的数据集是在3个开源数据集之上生成构建的,这3个数据集分别是:
·Chinese Scientific Literature Dataset
·CNN-DailyMail News Text Summarization
·arXiv Dataset
您可以直接搜索这些原始数据集的名称或是从以下链接访问它们
·https://github.com/ydli-ai/CSL
·https://www.kaggle.com/datasets/gowrishankarp/newspaper-text-summarization-cnn-dailymail
·https://www.kaggle.com/datasets/Cornell-University/arxiv
我们在这些数据集的基础上针对每一个摘要或新闻生成了5个“问题-答案”对。数据分布如下:
---从Chinese Scientific Literature Dataset(CSL)数据集中生成了25836条中文数据,共129180个问答对。
---从CNN-DailyMail News Text Summarization数据集中生成了2026条数据,共10130个问答对。
---从arXiv Dataset数据集中生成了3602条英文数据,共18010个问答对。
此外,由于此数据集是我们Luotuo-QA项目的一部分,我们将它叫做luotuo-QA-B。
您可以在这里查看Luotuo-QA项目:https://github.com/LC1332/Luotuo-QA
此数据集适用于训练和评估中文对话式问答模型。有益于推动中文自然语言处理领域的发展,同时也为研究人员和开发者提供了一个基准,用于比较不同模型的性能和探索新的方法。
我们希望这一工作能够促进全球范围内中文语境对话式问答任务的研究和进一步的创新。
-----------------------------------------------------------------------------------------------------------------------------------------------
Anki_Card is an electronic flashcard system used for memory and learning. We have created a question-and-answer dataset in a similar format to facilitate research and development of question-answering models in both Chinese and English contexts.
Our dataset is constructed based on three open-source datasets:
·Chinese Scientific Literature Dataset
·CNN-DailyMail News Text Summarization
·arXiv Dataset
You can directly search for the names of these original datasets or access them from the following links:
·Chinese Scientific Literature Dataset (CSL): https://github.com/ydli-ai/CSL
·CNN-DailyMail News Text Summarization: https://www.kaggle.com/datasets/gowrishankarp/newspaper-text-summarization-cnn-dailymail
·arXiv Dataset: https://www.kaggle.com/datasets/Cornell-University/arxiv
Based on these datasets, we have generated five "question-answer" pairs for each summary or news article. The data distribution is as follows:
---From the Chinese Scientific Literature Dataset (CSL), we generated 25,836 Chinese data points, resulting in a total of 129,180 question-answer pairs.
---From the CNN-DailyMail News Text Summarization dataset, we generated 2,026 data points, resulting in a total of 10,130 question-answer pairs.
---From the arXiv Dataset, we generated 3,602 English data points, resulting in a total of 18,010 question-answer pairs.
Furthermore, as this dataset is part of our Luotuo-QA project, we refer to it as luotuo-QA-B.
You can find the Luotuo-QA project here: https://github.com/LC1332/Luotuo-QA
This dataset is suitable for training and evaluating Chinese conversational question-answering models. It contributes to the development of Chinese natural language processing and provides researchers and developers with a benchmark for comparing the performance of different models and exploring new approaches.
We hope that this work will promote research and further innovation in Chinese conversational question-answering tasks on a global scale.
### Languages
CHINESE, ENGLISH
### Data Instances
Chinese data example (中文数据样例):
```
{
"story": "中国股市发展中特有的股权分置结构决定了研究股市收益率问题的复杂性.本文提出用全收益率的标准来衡量中国股市的整体收益率,认为在股权分置及其逐步解决的过程中,研究股市全收益率具有重要的意义,也是讨论股市其它问题的理论基础.随着股权分置改革渐进式地推进,中国股市各类股权所有者的收益分布会发生显著的结构性变化.从长期看,股权分置改革能使投资股东和原始股东的收益函数趋于一致,有助于实现整体收益的最大化.",
"questions": [
"为什么研究股市收益率问题复杂?",
"用什么标准来衡量中国股市的整体收益率?",
"股权分置改革对股东收益分布会有什么影响?",
"股权分置改革的推进方式是什么?",
"为什么研究股市全收益率具有重要意义?"
],
"answers": [
"因为中国股市发展中特有的股权分置结构决定了研究股市收益率问题的复杂性。",
"用全收益率的标准来衡量中国股市的整体收益率。",
"股权分置改革会使投资股东和原始股东的收益函数趋于一致,有助于实现整体收益的最大化。",
"股权分置改革是渐进式地推进的。",
"因为研究股市全收益率是讨论股市其它问题的理论基础,也在股权分置及其逐步解决的过程中具有重要的意义。"
],
"language": "Chinese"
}
```
English data example (英文数据样例):
```
{
"story": "'(CNN) -- A 14-year-old was arrested late Tuesday after shining a powerful laser light into the eyes of a pilot who was approaching Los Angeles International Airport, the Federal Aviation Administration said. The arrest puts a spotlight on what the FAA calls a dangerous problem in recent years. In Tuesday's case, the pilot was about 2,000 feet in the air and nobody was hurt in the incident, said Ian Gregor, an FAA spokesman. \"It's potentially very dangerous to shine a laser at an aircraft because a laser can distract a pilot and there have been cases where pilots have suffered temporary vision problems as a result of being struck by a laser beam,\" Gregor said. \" We've had reports of pilots having to turn over control of the aircraft to a co-pilot or had to abort landing.\" Gregor said Los Angeles International Airport has had many instances of laser attacks. \"Pilots reported 102 laser incidents around LAX in 2010. Most of any airport in the country,\" Gregor said.'",
"questions": [
"What happened to the 14-year-old?",
"Why is shining a laser at an aircraft dangerous?",
"What have pilots had to do in some cases of laser attacks?",
"How many laser incidents were reported around LAX in 2010?",
"What is the FAA's concern about laser attacks?"
],
"answers": [
"The 14-year-old was arrested for shining a powerful laser light into the eyes of a pilot.",
"Shining a laser at an aircraft is dangerous because it can distract a pilot and cause temporary vision problems.",
"In some cases of laser attacks, pilots have had to turn over control of the aircraft to a co-pilot or had to abort landing.",
"102 laser incidents were reported around LAX in 2010, the most of any airport in the country.",
"The FAA is concerned about laser attacks because they pose a dangerous problem for pilots and can cause temporary vision problems."
],
"language": "English"
}
```
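Each record pairs a single story with five parallel `questions`/`answers` lists. As a minimal sketch (plain Python; the record below uses invented toy values rather than real dataset rows), one record can be flattened into individual question-answer examples like this:

```python
# Toy record mirroring the schema shown above (values are invented).
record = {
    "story": "A short source passage ...",
    "questions": ["Q1?", "Q2?", "Q3?", "Q4?", "Q5?"],
    "answers": ["A1.", "A2.", "A3.", "A4.", "A5."],
    "language": "English",
}

# Flatten the parallel lists into one example per question-answer pair.
qa_pairs = [
    {
        "story": record["story"],
        "question": q,
        "answer": a,
        "language": record["language"],
    }
    for q, a in zip(record["questions"], record["answers"])
]
print(len(qa_pairs))  # one flat example per QA pair
```

This flat form is often more convenient for training extractive or conversational QA models than the nested per-story layout.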
### Licensing Information
我们的协议与三个原始数据集的协议保持一致,请阅读以下内容。
·CSL数据集的协议是Apache License 2.0,除非遵守许可证,否则您不得使用此文件
·CNN-DailyMail News Text Summarization数据集的协议是 CC0: Public Domain
·arXiv数据集的协议是 CC0: Public Domain
-----------------------------------------------------------------------------------------------------------------------------------------------
Our agreements are consistent with the agreements of three original datasets. Please read the following information.
· The protocol for the CSL dataset is Apache License 2.0. You are not allowed to use this file unless you comply with the license.
· The protocol for the CNN-DailyMail News Text Summarization dataset is CC0: Public Domain.
· The protocol for the arXiv dataset is CC0: Public Domain.
### Citation Information
如果您在项目中使用了我们的模型、代码或者数据,请引用我们。
Please cite us if you use the data or code in this repo.
```bibtex
@misc{luotuo-qa,
  author={Jianshen Liao and Ao Sun and Qinyu Luo and Hongsen Huang and Cheng Li},
title = {Luotuo-QA: Better Conversational Question Answering Model with Answer Completion},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/LC1332/Luotuo-QA}},
}
```
|
Thaweewat/instruct-qa-thai-combined | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 77970280
num_examples: 41740
download_size: 17214030
dataset_size: 77970280
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- th
size_categories:
- 10K<n<100K
---
### Instruct-QA-Thai-Combined
#### Dataset Description
**Instruct-QA-Thai-Combined** is a rich collection of question-answering datasets compiled from various sources,\
tailored for the development and benchmarking of Thai language question-answering systems.
#### Languages
The primary language present in this dataset is Thai.
### Data Structure
#### Data Sources:
- **Wiki QA:** 17,000 question-answer pairs.
- **MKQA:** 10,000 queries sampled from the Google Natural Questions dataset.
- **iapp-wiki-qa-squad:** 5,761 document/question pairs.
- **ThaiQA Squad:** 4,000 question-answer pairs from Thai Wikipedia.
- **TyDi QA:** 3,789 question-answer pairs in diverse languages.
- **XQuAD:** 1,190 question-answer pairs for cross-lingual question answering performance.
#### Features:
The dataset includes the following features:
- **Instruction**: A question related to a specific topic.
- **Input**: Long-form context.
- **Answer**: The corresponding answer to the question.
- **Source**: The dataset each question and answer pair originated from.
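As a sketch of how these fields might be consumed (plain Python with invented toy values; the column names follow the YAML header above — `instruction`, `input`, `answer`, `source`), rows can be grouped by the dataset they originated from:

```python
# Toy records mirroring the column names in the YAML header above; values are invented.
records = [
    {"instruction": "Q about topic A", "input": "context A", "answer": "ans A", "source": "xquad"},
    {"instruction": "Q about topic B", "input": "context B", "answer": "ans B", "source": "thaiqa"},
    {"instruction": "Q about topic C", "input": "context C", "answer": "ans C", "source": "xquad"},
]

# Group rows by originating dataset, e.g. to stratify evaluation per source.
by_source = {}
for row in records:
    by_source.setdefault(row["source"], []).append(row)

counts = {src: len(rows) for src, rows in by_source.items()}
print(counts)  # {'xquad': 2, 'thaiqa': 1}
```

Per-source grouping like this is useful because the constituent datasets carry different licenses and answer styles.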
#### Source Data
The source data includes multiple datasets, each with its licensing:
- XQuAD (CC BY-SA 4.0)
- Thai QA (CC BY-SA-NC 3.0)
- TyDi QA (Apache-2.0 License)
- iapp-wiki-qa-dataset (MIT License)
- MKQA (Apple)
#### Citation Information
Please cite the following if you use the Instruct-QA-Thai-Combined dataset in your work:
```
@misc{instruct-qa-thai-combined,
author = {Thaweewat},
title = {Instruct-QA-Thai-Combined: A Comprehensive Thai Question Answering Dataset},
year = {2024},
publisher = {Hugging Face},
journal = {Hugging Face Datasets},
howpublished = {\url{https://huggingface.co/datasets/Thaweewat/instruct-qa-thai-combined}}
}
```
### Acknowledgements
Special thanks to the contributors of the original datasets: NECTEC, DeepMind, Google Research, and Apple, among others.
--- |
Jay-Rajput/DIS_IPL_Preds | ---
license: apache-2.0
configs:
- config_name: predictions
  data_files: predictions/*.json
---
|
senhorsapo/Echidna | ---
license: openrail
---
|
teelinsan/camoscio_cleaned | ---
language: it
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 20903457.244625207
num_examples: 50245
download_size: 13083590
dataset_size: 20903457.244625207
---
# Dataset Card for "camoscio_cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RealTimeData/github_july_week1_2023 | ---
dataset_info:
features:
- name: full_name
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: readme
dtype: string
splits:
- name: train
num_bytes: 5834244
num_examples: 951
download_size: 2894683
dataset_size: 5834244
---
# Dataset Card for "github_july_week1_2023"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ZMaxwell-Smith/OIL | ---
license: cc-by-nc-nd-4.0
---
__The Online Indonesian Learning (OIL) Dataset__
The Online Indonesian Learning (OIL) dataset or corpus currently contains lessons from three Indonesian teachers who have posted content on YouTube.
For further details, please see Zara Maxwell-Smith and Ben Foley (forthcoming), Automated speech recognition of Indonesian-English language lessons on YouTube using transfer learning, Field Matters Workshop, EACL 2023.
How to cite this dataset.
Please use the following .bib to reference this work.
```
@inproceedings{Maxwell-Smith_Foley_2023_Automated,
title={{Automated speech recognition of Indonesian-English language lessons on YouTube using transfer learning}},
author={Maxwell-Smith, Zara and Foley, Ben},
booktitle={Proceedings of the {Second Workshop on NLP Applications to Field Linguistics (EACL)}},
pages={},
year={forthcoming}
}
```
To stream the videos of these teachers please visit:
Indonesian Language for Beginners - https://www.youtube.com/@learningindonesianlanguage3334
5-Minute Indonesian - https://www.youtube.com/@5-minuteindonesian534/featured
Dua Budaya - https://www.youtube.com/@DuaBudaya/about
Copies of some lessons on these channels are available as part of this dataset in mp4 and wav formats.
A select number of lessons have matching ELAN files with human and human/machine generated orthographic transcriptions of the audio, as well as 'tiers' with machine inference only.
Detailed information about the audio quality, remarks on background noise, code-switching behaviour and lesson content is available in the paper above.
Almost all videos contain a mix of languages, with some dominated by Indonesian or English.
Some videos explicitly focused on variation in Indonesian or words from other languages which are commonly mixed into Indonesian by speakers.
|
liuyanchen1015/MULTI_VALUE_cola_me_us | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1585
num_examples: 22
- name: test
num_bytes: 1737
num_examples: 24
- name: train
num_bytes: 17342
num_examples: 224
download_size: 15369
dataset_size: 20664
---
# Dataset Card for "MULTI_VALUE_cola_me_us"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bibinsee/maime | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1350
num_examples: 4
download_size: 3575
dataset_size: 1350
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
craigslist_bargains | ---
annotations_creators:
- machine-generated
language_creators:
- crowdsourced
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- dialogue-modeling
paperswithcode_id: craigslistbargains
pretty_name: CraigslistBargains
dataset_info:
features:
- name: agent_info
sequence:
- name: Bottomline
dtype: string
- name: Role
dtype: string
- name: Target
dtype: float32
- name: agent_turn
sequence: int32
- name: dialogue_acts
sequence:
- name: intent
dtype: string
- name: price
dtype: float32
- name: utterance
sequence: string
- name: items
sequence:
- name: Category
dtype: string
- name: Images
dtype: string
- name: Price
dtype: float32
- name: Description
dtype: string
- name: Title
dtype: string
splits:
- name: train
num_bytes: 8538836
num_examples: 5247
- name: test
num_bytes: 1353933
num_examples: 838
- name: validation
num_bytes: 966032
num_examples: 597
download_size: 25373618
dataset_size: 10858801
---
# Dataset Card for CraigslistBargains
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Decoupling Strategy and Generation in Negotiation Dialogues](https://worksheets.codalab.org/worksheets/0x453913e76b65495d8b9730d41c7e0a0c/)
- **Repository:** [Github: Stanford NLP Cocoa](https://github.com/stanfordnlp/cocoa/tree/master)
- **Paper:** [Decoupling Strategy and Generation in Negotiation Dialogues](https://arxiv.org/abs/1808.09637)
- **Leaderboard:** []()
- **Point of Contact:** [He He](hehe@cs.nyu.edu)
### Dataset Summary
We study negotiation dialogues where two agents, a buyer and a seller, negotiate over the price of an item for sale. We collected a dataset of more than 6K negotiation dialogues over multiple categories of products scraped from Craigslist. Our goal is to develop an agent that negotiates with humans through such conversations. The challenge is to handle both the negotiation strategy and the rich language for bargaining. To this end, we develop a modular framework which separates strategy learning from language generation. Specifically, we learn strategies in a coarse dialogue act space and instantiate that into utterances conditioned on dialogue history.
### Supported Tasks and Leaderboards
### Languages
This dataset is English
## Dataset Structure
### Data Instances
```
{
'agent_info': {
'Bottomline':
[
'None',
'None'
],
'Role':
[
'buyer',
'seller'
],
'Target':
[
7.0,
10.0
]
},
'agent_turn':
[
0,
1,
...
],
'dialogue_acts': {
'intent':
[
'init-price',
'unknown',
...
],
'price':
[
5.0,
-1.0,
...
]
},
'items': {
'Category':
[
'phone',
'phone'
],
'Description':
[
'Charge two devices simultaneously on the go...',
...
],
'Images':
[
'phone/6149527852_0.jpg',
'phone/6149527852_0.jpg'
],
'Price':
[
10.0,
10.0
],
'Title':
[
'Verizon Car Charger with Dual Output Micro USB and ...',
...
]
},
'utterance':
[
'Hi, not sure if the charger would work for my car...',
'It will work...',
...
]
}
```
### Data Fields
- `agent_info`: Information about each of the agents taking part in the dialogue
- `Bottomline`: TBD
- `Role`: Whether the agent is buyer or seller
- `Target`: Target price that the buyer/seller wants to hit in the negotiation
- `agent_turn`: Agent taking the current turn in the dialogue (`int` index corresponding to `Role` above)
- `dialogue_acts`: Rules-based information about the strategy of each agent for each turn
- `intent`: The intent of the agent at the particular turn (offer, accept, etc.)
- `price`: The current item price associated with the intent and turn in the bargaining process. Default value for missing: (`-1`)
- `items`: Information about the item the agents are bargaining for. **Note that there is an element for each of the fields below for each agent**
- `Category`: Category of the item
- `Description`: Description(s) of the item
- `Images`: (comma delimited) strings of image names of the item
- `Price`: Price(s) of the item. Default value for missing: (`-1`)
- `Title`: Title(s) of the item
- `utterance`: Utterance for each turn in the dialogue, corresponding to the agent in `agent_turns`. The utterance may be an empty string (`''`) for some turns if multiple dialogue acts take place after an utterance (e.g. there are often multiple dialogue acts associated with the closing of the bargaining process after all utterances have completed to describe the conclusion of the bargaining).
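A minimal sketch of consuming these parallel per-turn arrays (field names follow the description above; the example values are abbreviated from the instance shown earlier): zip them back into `(role, utterance, intent, price)` tuples, mapping the `-1` price sentinel to `None`.

```python
# Abbreviated example following the structure documented above.
example = {
    "agent_info": {"Role": ["buyer", "seller"], "Target": [7.0, 10.0]},
    "agent_turn": [0, 1],
    "dialogue_acts": {"intent": ["init-price", "unknown"], "price": [5.0, -1.0]},
    "utterance": ["Hi, would you take $5?", "It will work..."],
}

turns = []
for agent_idx, utt, intent, price in zip(
    example["agent_turn"],
    example["utterance"],
    example["dialogue_acts"]["intent"],
    example["dialogue_acts"]["price"],
):
    # agent_turn indexes into the per-agent arrays under agent_info.
    role = example["agent_info"]["Role"][agent_idx]
    turns.append((role, utt, intent, None if price == -1.0 else price))

print(turns[0])  # ('buyer', 'Hi, would you take $5?', 'init-price', 5.0)
```

Note that real examples may contain empty utterances and trailing dialogue acts at the end of the bargaining process, so a robust loader should tolerate length mismatches rather than assume strictly parallel arrays.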
### Data Splits
This dataset contains three splits, `train`, `validation` and `test`. Note that `test` is not provided with `dialogue_acts` information as described above. To ensure schema consistency across dataset splits, the `dialogue_acts` field in the `test` split is populated with the default values: `{"price": -1.0, "intent": ""}`
The counts of examples in each split are as follows:
|                         | Train | Valid | Test |
|-------------------------|-------|-------|------|
| Input Examples          | 5247  | 597   | 838  |
| Average Dialogue Length | 9.14  | 9.17  | 9.24 |
## Dataset Creation
From the [source paper](https://arxiv.org/pdf/1808.09637.pdf) for this dataset:
> To generate the negotiation scenarios, we
> scraped postings on sfbay.craigslist.org
> from the 6 most popular categories (housing, furniture, cars, bikes, phones, and electronics). Each
> posting produces three scenarios with the buyer’s
> target prices at 0.5x, 0.7x and 0.9x of the listing
> price. Statistics of the scenarios are shown in Table 2.
> We collected 6682 human-human dialogues on
> AMT using the interface shown in Appendix A
> Figure 2. The dataset statistics in Table 3 show
> that CRAIGSLISTBARGAIN has longer dialogues
> and more diverse utterances compared to prior
> datasets. Furthermore, workers were encouraged
> to embellish the item and negotiate side offers
> such as free delivery or pick-up. This highly relatable scenario leads to richer dialogues such as
> the one shown in Table 1. We also observed various persuasion techniques listed in Table 4 such as
> embellishment,
### Curation Rationale
See **Dataset Creation**
### Source Data
See **Dataset Creation**
#### Initial Data Collection and Normalization
See **Dataset Creation**
#### Who are the source language producers?
See **Dataset Creation**
### Annotations
#### Annotation process
Annotations for the `dialogue_acts` in `train` and `test` were generated via a rules-based system which can be found in [this script](https://github.com/stanfordnlp/cocoa/blob/master/craigslistbargain/parse_dialogue.py)
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
[More Information Needed]
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
[More Information Needed]
### Dataset Curators
He He and Derek Chen and Anusha Balakrishnan and Percy Liang
Computer Science Department, Stanford University
`{hehe,derekchen14,anusha,pliang}@cs.stanford.edu`
The work through which this data was produced was supported by
DARPA Communicating with Computers (CwC)
program under ARO prime contract no. W911NF15-1-0462
### Licensing Information
[More Information Needed]
### Citation Information
```
@misc{he2018decoupling,
title={Decoupling Strategy and Generation in Negotiation Dialogues},
author={He He and Derek Chen and Anusha Balakrishnan and Percy Liang},
year={2018},
eprint={1808.09637},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@ZacharySBrown](https://github.com/ZacharySBrown) for adding this dataset. |
tyzhu/fw_squad_num_train_10000_eval_100 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2875213
num_examples: 20100
- name: train_doc2id
num_bytes: 1736063
num_examples: 10100
- name: train_id2doc
num_bytes: 1766363
num_examples: 10100
- name: train_find_word
num_bytes: 1108850
num_examples: 10000
- name: eval_find_word
num_bytes: 10806
num_examples: 100
download_size: 3625030
dataset_size: 7497295
---
# Dataset Card for "fw_squad_num_train_10000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/victorious_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of victorious/ヴィクトリアス/胜利 (Azur Lane)
This is the dataset of victorious/ヴィクトリアス/胜利 (Azur Lane), containing 254 images and their tags.
The core tags of this character are `blonde_hair, long_hair, breasts, blue_eyes, large_breasts, laurel_crown, hair_ornament, very_long_hair, bangs, ribbon, hair_ribbon, wrist_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 254 | 389.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/victorious_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 254 | 230.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/victorious_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 624 | 477.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/victorious_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 254 | 351.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/victorious_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 624 | 665.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/victorious_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/victorious_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, bare_shoulders, black_panties, cleavage, looking_at_viewer, navel, solo, black_thighhighs, chain, flower_ornament, sleeveless_dress, veil, cowboy_shot, blush, floating_hair, holding_staff, standing, blue_rose, open_mouth, :d, stomach, turret |
| 1 | 12 |  |  |  |  |  | 1girl, china_dress, cleavage, looking_at_viewer, white_dress, center_opening, solo, black_thighhighs, navel, official_alternate_costume, pelvic_curtain, short_sleeves, sitting, gold_trim, smile, folding_fan, holding_fan, simple_background |
| 2 | 24 |  |  |  |  |  | 1girl, scarf, solo, looking_at_viewer, black_sweater, brown_coat, hairclip, ribbed_sweater, bare_shoulders, off_shoulder, black_thighhighs, blush, cross_necklace, sweater_vest, alternate_costume, sleeveless_sweater, open_mouth, :d, casual |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_panties | cleavage | looking_at_viewer | navel | solo | black_thighhighs | chain | flower_ornament | sleeveless_dress | veil | cowboy_shot | blush | floating_hair | holding_staff | standing | blue_rose | open_mouth | :d | stomach | turret | china_dress | white_dress | center_opening | official_alternate_costume | pelvic_curtain | short_sleeves | sitting | gold_trim | smile | folding_fan | holding_fan | simple_background | scarf | black_sweater | brown_coat | hairclip | ribbed_sweater | off_shoulder | cross_necklace | sweater_vest | alternate_costume | sleeveless_sweater | casual |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:----------------|:-----------|:--------------------|:--------|:-------|:-------------------|:--------|:------------------|:-------------------|:-------|:--------------|:--------|:----------------|:----------------|:-----------|:------------|:-------------|:-----|:----------|:---------|:--------------|:--------------|:-----------------|:-----------------------------|:-----------------|:----------------|:----------|:------------|:--------|:--------------|:--------------|:--------------------|:--------|:----------------|:-------------|:-----------|:-----------------|:---------------|:-----------------|:---------------|:--------------------|:---------------------|:---------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | | | X | X | X | X | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 2 | 24 |  |  |  |  |  | X | X | | | X | | X | X | | | | | | X | | | | | X | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
NLPC-UOM/Sinhala-Stopword-list | ---
annotations_creators: []
language:
- si
license:
- mit
---
|
anan-2024/twitter_dataset_1713202903 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 148265
num_examples: 383
download_size: 82607
dataset_size: 148265
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Goorm-AI-04/Drone_Doppler_Noise | ---
dataset_info:
features:
- name: image
sequence:
sequence:
sequence: float64
- name: label
dtype: int64
- name: type
dtype: string
- name: noise_var_0.0001
sequence:
sequence:
sequence: float64
- name: noise_var_0.0005
sequence:
sequence:
sequence: float64
- name: noise_var_0.001
sequence:
sequence:
sequence: float64
- name: noise_var_0.005
sequence:
sequence:
sequence: float64
- name: noise_var_0.01
sequence:
sequence:
sequence: float64
splits:
- name: train
num_bytes: 395275453
num_examples: 3497
download_size: 314133140
dataset_size: 395275453
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Drone_Doppler_Noise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nandikaa08/lora_testing | ---
license: apache-2.0
---
|
leemeng/jcommonsenseqa-v1.1 | ---
license: cc-by-4.0
dataset_info:
features:
- name: q_id
dtype: int64
- name: question
dtype: string
- name: choice0
dtype: string
- name: choice1
dtype: string
- name: choice2
dtype: string
- name: choice3
dtype: string
- name: choice4
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1183829
num_examples: 8939
- name: validation
num_bytes: 148293
num_examples: 1119
download_size: 887894
dataset_size: 1332122
---
|
AlexBlck/ANAKIN | ---
license: cc-by-4.0
task_categories:
- video-classification
- visual-question-answering
language:
- en
pretty_name: 'ANAKIN: manipulated videos and mask annotations'
size_categories:
- 1K<n<10K
---
[arxiv](https://arxiv.org/abs/2303.13193)
# ANAKIN
ANAKIN is a dataset of mANipulated videos and mAsK annotatIoNs.
To the best of our knowledge, ANAKIN is the first real-world dataset of professionally edited video clips,
paired with source videos, edit descriptions, and binary mask annotations of the edited regions.
ANAKIN consists of 1023 videos in total, including 352 edited videos from the
[VideoSham](https://github.com/adobe-research/VideoSham-dataset)
dataset plus 671 new videos collected from the Vimeo platform.
## Data Format
| Label | Description |
|-------|-------------|
| video-id | Video ID |
| full* | Full-length original video |
| trimmed | Short clip trimmed from `full` |
| edited | Manipulated version of `trimmed` |
| masks* | Per-frame binary masks, annotating the manipulation |
| start-time* | Trim beginning time (in seconds) |
| end-time* | Trim end time (in seconds) |
| task | Task given to the video editor |
| manipulation-type | One of the 5 manipulation types: splicing, inpainting, swap, audio, frame-level |
| editor-id | Editor ID |
*There are several subset configurations available. The choice depends on whether you need to download the full-length videos and/or only the videos that have masks available.
`start-time` and `end-time` are returned only for subset configs that include full videos.
| config | full | masks | train/val/test |
| ---------- | ---- | ----- | -------------- |
| all | yes | maybe | 681/98/195 |
| no-full | no | maybe | 716/102/205 |
| has-masks | no | yes | 297/43/85 |
| full-masks | yes | yes | 297/43/85 |
## Example
The data can either be downloaded or [streamed](https://huggingface.co/docs/datasets/stream).
### Downloaded
```python
from datasets import load_dataset
from torchvision.io import read_video
config = 'no-full' # ['all', 'no-full', 'has-masks', 'full-masks']
dataset = load_dataset("AlexBlck/ANAKIN", config, num_proc=8)
for sample in dataset['train']: # ['train', 'validation', 'test']
trimmed_video, trimmed_audio, _ = read_video(sample['trimmed'], output_format="TCHW")
edited_video, edited_audio, _ = read_video(sample['edited'], output_format="TCHW")
masks = sample['masks']
print(sample.keys())
```
### Streamed
```python
from datasets import load_dataset
import cv2
dataset = load_dataset("AlexBlck/ANAKIN", streaming=True)
sample = next(iter(dataset['train'])) # ['train', 'validation', 'test']
cap = cv2.VideoCapture(sample['trimmed'])
while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    # process `frame` here
cap.release()
``` |
cmxuebuhui/subjectClassify | ---
license: apache-2.0
---
|
lipishan/ipa-phoneme-to-word | ---
license: other
---
|
freshpearYoon/vr_train_free_54 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 6227657052
num_examples: 10000
download_size: 932462690
dataset_size: 6227657052
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
victor/autotrain-data-donut-vs-croissant | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: donut-vs-croissant
## Dataset Description
This dataset has been automatically processed by AutoTrain for project donut-vs-croissant.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<512x512 RGB PIL image>",
"target": 0
},
{
"image": "<512x512 RGB PIL image>",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(num_classes=2, names=['croissant', 'donut'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follow:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 133 |
| valid | 362 |
|
Macromrit/ayurveda-text-based-qanda | ---
license: mit
task_categories:
- conversational
- text-generation
language:
- en
tags:
- ayurveda
- biology
- q_and_a
size_categories:
- 1K<n<10K
--- |
zoohun/dataset_low_test | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 251135
num_examples: 1023
download_size: 105571
dataset_size: 251135
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ademax/mozilla-vie-speech2text | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 351039045.6835097
num_examples: 14338
- name: test
num_bytes: 25069782.960490286
num_examples: 1000
download_size: 365566062
dataset_size: 376108828.644
---
# Dataset Card for "mozilla-vie-speech2text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chuyin0321/earnings-stocks | ---
dataset_info:
features:
- name: symbol
dtype: string
- name: date
dtype: timestamp[ns, tz=EST]
- name: eps_estimate
dtype: float64
- name: reported_eps
dtype: float64
- name: surprise
dtype: float64
splits:
- name: train
num_bytes: 3707357
num_examples: 93309
download_size: 1828938
dataset_size: 3707357
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "earnings-stocks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TsumikiQAQ/testset | ---
license: apache-2.0
---
|
Paul/hatecheck | ---
annotations_creators:
- crowdsourced
language_creators:
- expert-generated
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: HateCheck
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- hate-speech-detection
---
# Dataset Card for HateCheck
## Dataset Description
HateCheck is a suite of functional tests for hate speech detection models.
The dataset contains 3,728 validated test cases in 29 functional tests.
19 functional tests correspond to distinct types of hate. The other 11 functional tests cover challenging types of non-hate.
This allows for targeted diagnostic insights into model performance.
In our ACL paper, we found critical weaknesses in all commercial and academic hate speech detection models that we tested with HateCheck.
Please refer to the paper (linked below) for results and further discussion, as well as further information on the dataset and a full data statement.
- **Paper:** Röttger et al. (2021) - HateCheck: Functional Tests for Hate Speech Detection Models. https://aclanthology.org/2021.acl-long.4/ or https://arxiv.org/abs/2012.15606
- **Repository:** https://github.com/paul-rottger/hatecheck-data
- **Point of Contact:** paul.rottger@oii.ox.ac.uk
## Dataset Structure
"test.csv" contains all 3,728 validated test cases. Each test case (row) has the following attributes:
**functionality**
The shorthand for the functionality tested by the test case.
**case_id**
The unique ID of the test case (assigned to each of the 3,901 cases we initially generated).
**test_case**
The text of the test case.
**label_gold**
The gold standard label (hateful/non-hateful) of the test case. All test cases within a given functionality have the same gold standard label.
**target_ident**
Where applicable, the protected group targeted or referenced by the test case. We cover seven protected groups in the test suite: women, trans people, gay people, black people, disabled people, Muslims and immigrants.
**direction**
For hateful cases, the binary secondary label indicating whether they are *directed* at an individual as part of a protected group or aimed at the group in *general*.
**focus_words**
Where applicable, the key word or phrase in a given test case (e.g. "cut their throats").
**focus_lemma**
Where applicable, the corresponding lemma (e.g. "cut sb. throat").
**ref_case_id**
For hateful cases, where applicable, the ID of the simpler hateful case which was perturbed to generate them.
For non-hateful cases, where applicable, the ID of the hateful case which is contrasted.
**ref_templ_id**
The equivalent, but for template IDs.
**templ_id**
The unique ID of the template from which the test case was generated (assigned to each of the 866 cases and templates from which we generated the 3,901 initial cases).
## Citation Information
When using HateCheck, please cite our ACL paper:
```bibtex
@inproceedings{rottger-etal-2021-hatecheck,
    title = "{H}ate{C}heck: Functional Tests for Hate Speech Detection Models",
    author = {R{\"o}ttger, Paul and
      Vidgen, Bertie and
      Nguyen, Dong and
      Waseem, Zeerak and
      Margetts, Helen and
      Pierrehumbert, Janet},
    booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
    month = aug,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.acl-long.4",
    doi = "10.18653/v1/2021.acl-long.4",
    pages = "41--58",
    abstract = "Detecting online hate is a difficult task that even state-of-the-art models struggle with. Typically, hate speech detection models are evaluated by measuring their performance on held-out test data using metrics such as accuracy and F1 score. However, this approach makes it difficult to identify specific model weak points. It also risks overestimating generalisable model performance due to increasingly well-evidenced systematic gaps and biases in hate speech datasets. To enable more targeted diagnostic insights, we introduce HateCheck, a suite of functional tests for hate speech detection models. We specify 29 model functionalities motivated by a review of previous research and a series of interviews with civil society stakeholders. We craft test cases for each functionality and validate their quality through a structured annotation process. To illustrate HateCheck{'}s utility, we test near-state-of-the-art transformer models as well as two popular commercial models, revealing critical model weaknesses.",
}
```
|
vahn9995/booksum-stable-diffusion-prompt | ---
license: bsd-3-clause-clear
task_categories:
- summarization
language:
- en
pretty_name: Stable Diffusion Book Sum Prompts
size_categories:
- 1K<n<10K
---
This dataset is based on the https://huggingface.co/datasets/kmfoda/booksum dataset. It was created by prompting GPT-3 with book chapters from the BookSum dataset to generate a multitude of Stable-Diffusion-friendly prompts. It can be used in large-language-model fine-tuning or training to create a model that outputs Stable Diffusion prompts. |
benayas/atis_nlpaug_20pct_v1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 416146
num_examples: 4455
download_size: 177350
dataset_size: 416146
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AravindVadlapudi02/UA_speech_multiclass | ---
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': 0_control
'1': 1_very_low
'2': 2_low
'3': 3_mid
'4': 4_high
- name: input_features
sequence:
sequence: float32
splits:
- name: train
num_bytes: 1910100348
num_examples: 1989
- name: test
num_bytes: 3457195200
num_examples: 3600
download_size: 619695502
dataset_size: 5367295548
---
# Dataset Card for "UA_speech_multiclass"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-34156b-59952145382 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: MohamedZaitoon/bart-fine-tune
metrics: ['rouge', 'accuracy', 'bleu', 'exact_match', 'f1', 'perplexity', 'recall', 'precision', 'roc_auc']
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: MohamedZaitoon/bart-fine-tune
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@sini raj p](https://huggingface.co/sini raj p) for evaluating this model. |
approximatelylinear/transformer-functions | ---
dataset_info:
features:
- name: signature
dtype: string
- name: body
dtype: string
- name: filename
dtype: string
splits:
- name: train
num_bytes: 17030109
num_examples: 16212
download_size: 5010779
dataset_size: 17030109
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "transformer-functions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ajibawa-2023__Code-13B | ---
pretty_name: Evaluation run of ajibawa-2023/Code-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ajibawa-2023/Code-13B](https://huggingface.co/ajibawa-2023/Code-13B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ajibawa-2023__Code-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T19:40:16.694610](https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Code-13B/blob/main/results_2023-12-09T19-40-16.694610.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5315302469691541,\n\
\ \"acc_stderr\": 0.0338171547995471,\n \"acc_norm\": 0.5374650243523146,\n\
\ \"acc_norm_stderr\": 0.034550805778528454,\n \"mc1\": 0.2962056303549572,\n\
\ \"mc1_stderr\": 0.01598359510181139,\n \"mc2\": 0.4246156253859874,\n\
\ \"mc2_stderr\": 0.01586771249517698\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5511945392491467,\n \"acc_stderr\": 0.014534599585097665,\n\
\ \"acc_norm\": 0.5733788395904437,\n \"acc_norm_stderr\": 0.014453185592920293\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6420035849432384,\n\
\ \"acc_stderr\": 0.004784312972495391,\n \"acc_norm\": 0.8328022306313483,\n\
\ \"acc_norm_stderr\": 0.003723897305645496\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670788,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670788\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502706986,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502706986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370331,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370331\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873506,\n \"\
acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873506\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6290322580645161,\n\
\ \"acc_stderr\": 0.027480541887953593,\n \"acc_norm\": 0.6290322580645161,\n\
\ \"acc_norm_stderr\": 0.027480541887953593\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162933,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162933\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.037425970438065864,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.037425970438065864\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244441,\n \"\
acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244441\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.02951928261681723,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.02951928261681723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4717948717948718,\n \"acc_stderr\": 0.025310639254933882,\n\
\ \"acc_norm\": 0.4717948717948718,\n \"acc_norm_stderr\": 0.025310639254933882\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n\
\ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6844036697247706,\n\
\ \"acc_stderr\": 0.019926117513869666,\n \"acc_norm\": 0.6844036697247706,\n\
\ \"acc_norm_stderr\": 0.019926117513869666\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3611111111111111,\n \"acc_stderr\": 0.03275773486100999,\n\
\ \"acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.03275773486100999\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.70042194092827,\n \"acc_stderr\": 0.02981802474975309,\n \
\ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.02981802474975309\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.04453197507374983,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.04453197507374983\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.027236013946196697,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.027236013946196697\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7215836526181354,\n\
\ \"acc_stderr\": 0.016028295188992476,\n \"acc_norm\": 0.7215836526181354,\n\
\ \"acc_norm_stderr\": 0.016028295188992476\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.02603389061357628,\n\
\ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.02603389061357628\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2916201117318436,\n\
\ \"acc_stderr\": 0.01520103251252044,\n \"acc_norm\": 0.2916201117318436,\n\
\ \"acc_norm_stderr\": 0.01520103251252044\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n\
\ \"acc_stderr\": 0.027731258647011998,\n \"acc_norm\": 0.6077170418006431,\n\
\ \"acc_norm_stderr\": 0.027731258647011998\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5771604938271605,\n \"acc_stderr\": 0.027487472980871588,\n\
\ \"acc_norm\": 0.5771604938271605,\n \"acc_norm_stderr\": 0.027487472980871588\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778862,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778862\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.408735332464146,\n\
\ \"acc_stderr\": 0.012555701346703385,\n \"acc_norm\": 0.408735332464146,\n\
\ \"acc_norm_stderr\": 0.012555701346703385\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4963235294117647,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.4963235294117647,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5179738562091504,\n \"acc_stderr\": 0.020214761037872404,\n \
\ \"acc_norm\": 0.5179738562091504,\n \"acc_norm_stderr\": 0.020214761037872404\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.031001209039894843,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.031001209039894843\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2962056303549572,\n\
\ \"mc1_stderr\": 0.01598359510181139,\n \"mc2\": 0.4246156253859874,\n\
\ \"mc2_stderr\": 0.01586771249517698\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7355958958168903,\n \"acc_stderr\": 0.012394724896983799\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19029567854435178,\n \
\ \"acc_stderr\": 0.010812347283182974\n }\n}\n```"
repo_url: https://huggingface.co/ajibawa-2023/Code-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|arc:challenge|25_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|gsm8k|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hellaswag|10_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-40-16.694610.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T19-40-16.694610.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- '**/details_harness|winogrande|5_2023-12-09T19-40-16.694610.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T19-40-16.694610.parquet'
- config_name: results
data_files:
- split: 2023_12_09T19_40_16.694610
path:
- results_2023-12-09T19-40-16.694610.parquet
- split: latest
path:
- results_2023-12-09T19-40-16.694610.parquet
---
# Dataset Card for Evaluation run of ajibawa-2023/Code-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ajibawa-2023/Code-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ajibawa-2023/Code-13B](https://huggingface.co/ajibawa-2023/Code-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ajibawa-2023__Code-13B",
"harness_winogrande_5",
	split="latest")
```
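As shown in the configuration list above, each timestamped split encodes the run time with underscores in place of the characters that are not allowed in split names. A small helper (a hypothetical convenience function, not part of the `datasets` API) can recover a `datetime` from such a split name:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    """Convert a run-timestamp split name such as
    '2023_12_09T19_40_16.694610' back into a datetime object."""
    # The date part uses underscores instead of dashes, and the time
    # part uses underscores instead of colons; restore both.
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_to_datetime("2023_12_09T19_40_16.694610"))
# → 2023-12-09 19:40:16.694610
```

This can be useful for sorting or filtering runs by date when a repo accumulates several evaluation timestamps.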
## Latest results
These are the [latest results from run 2023-12-09T19:40:16.694610](https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Code-13B/blob/main/results_2023-12-09T19-40-16.694610.json) (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5315302469691541,
"acc_stderr": 0.0338171547995471,
"acc_norm": 0.5374650243523146,
"acc_norm_stderr": 0.034550805778528454,
"mc1": 0.2962056303549572,
"mc1_stderr": 0.01598359510181139,
"mc2": 0.4246156253859874,
"mc2_stderr": 0.01586771249517698
},
"harness|arc:challenge|25": {
"acc": 0.5511945392491467,
"acc_stderr": 0.014534599585097665,
"acc_norm": 0.5733788395904437,
"acc_norm_stderr": 0.014453185592920293
},
"harness|hellaswag|10": {
"acc": 0.6420035849432384,
"acc_stderr": 0.004784312972495391,
"acc_norm": 0.8328022306313483,
"acc_norm_stderr": 0.003723897305645496
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670788,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670788
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502706986,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502706986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370331,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370331
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.024026846392873506,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.024026846392873506
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6290322580645161,
"acc_stderr": 0.027480541887953593,
"acc_norm": 0.6290322580645161,
"acc_norm_stderr": 0.027480541887953593
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.037425970438065864,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.037425970438065864
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.03383201223244441,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.03383201223244441
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.02951928261681723,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.02951928261681723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4717948717948718,
"acc_stderr": 0.025310639254933882,
"acc_norm": 0.4717948717948718,
"acc_norm_stderr": 0.025310639254933882
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5378151260504201,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.5378151260504201,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6844036697247706,
"acc_stderr": 0.019926117513869666,
"acc_norm": 0.6844036697247706,
"acc_norm_stderr": 0.019926117513869666
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.03275773486100999,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.03275773486100999
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.02981802474975309,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.02981802474975309
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.043285772152629715,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.043285772152629715
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.04453197507374983,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.04453197507374983
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196697,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196697
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7215836526181354,
"acc_stderr": 0.016028295188992476,
"acc_norm": 0.7215836526181354,
"acc_norm_stderr": 0.016028295188992476
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.02603389061357628,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.02603389061357628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2916201117318436,
"acc_stderr": 0.01520103251252044,
"acc_norm": 0.2916201117318436,
"acc_norm_stderr": 0.01520103251252044
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.027731258647011998,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.027731258647011998
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5771604938271605,
"acc_stderr": 0.027487472980871588,
"acc_norm": 0.5771604938271605,
"acc_norm_stderr": 0.027487472980871588
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778862,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.408735332464146,
"acc_stderr": 0.012555701346703385,
"acc_norm": 0.408735332464146,
"acc_norm_stderr": 0.012555701346703385
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4963235294117647,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.4963235294117647,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5179738562091504,
"acc_stderr": 0.020214761037872404,
"acc_norm": 0.5179738562091504,
"acc_norm_stderr": 0.020214761037872404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.031001209039894843,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.031001209039894843
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2962056303549572,
"mc1_stderr": 0.01598359510181139,
"mc2": 0.4246156253859874,
"mc2_stderr": 0.01586771249517698
},
"harness|winogrande|5": {
"acc": 0.7355958958168903,
"acc_stderr": 0.012394724896983799
},
"harness|gsm8k|5": {
"acc": 0.19029567854435178,
"acc_stderr": 0.010812347283182974
}
}
```
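If you process result dumps like the one above programmatically, a small helper can average the per-subject `hendrycksTest` (MMLU) accuracies. This is a sketch over an abridged sample dict; the key names follow the harness output format shown above:

```python
# Sketch: average the per-subject accuracies from an lm-evaluation-harness
# style results dict (abridged sample values for illustration only).
results = {
    "harness|arc:challenge|25": {"acc": 0.5512, "acc_norm": 0.5734},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4667, "acc_norm": 0.4667},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5197, "acc_norm": 0.5197},
    "harness|hendrycksTest-virology|5": {"acc": 0.4398, "acc_norm": 0.4398},
}

def mean_mmlu_acc(res: dict) -> float:
    """Mean 'acc' over all hendrycksTest (MMLU) subjects in the dict."""
    accs = [v["acc"] for k, v in res.items() if "hendrycksTest" in k]
    return sum(accs) / len(accs)

print(round(mean_mmlu_acc(results), 4))
```

The same filter-and-average pattern works for `acc_norm`, or for the full 57-subject dict above.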
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
KyonBS/HigiriKunoichiTsubaki | ---
license: openrail
---
|
autoevaluate/autoeval-staging-eval-project-02148524-0081-4ca2-963d-7e44c726ec75-1311 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: binary_classification
model: autoevaluate/binary-classification
metrics: ['matthews_correlation']
dataset_name: glue
dataset_config: sst2
dataset_split: validation
col_mapping:
text: sentence
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Text Classification
* Model: autoevaluate/binary-classification
* Dataset: glue
* Config: sst2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
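The `col_mapping` in the card header tells the evaluator how dataset columns map onto the fields the model expects (`text: sentence`, `target: label`). A minimal plain-Python illustration of that renaming, with a made-up example row:

```python
# Sketch: apply a col_mapping like the one in the card header to one row.
# Mapping direction assumed: model-facing field -> dataset column.
col_mapping = {"text": "sentence", "target": "label"}

def remap(example: dict, mapping: dict) -> dict:
    """Build the model-facing row from a dataset row using col_mapping."""
    return {model_col: example[ds_col] for model_col, ds_col in mapping.items()}

row = {"sentence": "a charming film", "label": 1, "idx": 0}
print(remap(row, col_mapping))  # → {'text': 'a charming film', 'target': 1}
```

Columns not listed in the mapping (like `idx` here) are simply dropped from the model-facing view.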
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
wanyu/IteraTeR_full_sent | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text2text-generation
task_ids: []
pretty_name: IteraTeR_full_sent
language_bcp47:
- en-US
tags:
- conditional-text-generation
- text-editing
---
Paper: [Understanding Iterative Revision from Human-Written Text](https://arxiv.org/abs/2203.03802)
Authors: Wanyu Du, Vipul Raheja, Dhruv Kumar, Zae Myung Kim, Melissa Lopez, Dongyeop Kang
Github repo: https://github.com/vipulraheja/IteraTeR
|
croyer/MIMIC-III-split | ---
license: mit
language:
- en
tags:
- medical
pretty_name: MIMIC-III split
size_categories:
- 10K<n<100K
configs:
- config_name: default
data_files:
- split: train
path: idsTrain.csv
- split: test
path: idsTest.csv
- split: validation
path: idsValidation.csv
--- |
aisc-team-b1/PMC-CaseReport-Finetuning | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 591479454
num_examples: 316838
- name: test
num_bytes: 236815962
num_examples: 120836
download_size: 158304015
dataset_size: 828295416
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
kenthug/kusakanmuri | ---
license: afl-3.0
---
|
ripanroy/replace-anything | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1577364.0
num_examples: 10
download_size: 1570992
dataset_size: 1577364.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CVasNLPExperiments/fairness_chef_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_4800 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: scores
sequence: float64
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 2513915
num_examples: 4800
download_size: 238232
dataset_size: 2513915
---
# Dataset Card for "fairness_chef_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_4800"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/komuro_chinami_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of komuro_chinami/小室千奈美 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of komuro_chinami/小室千奈美 (THE iDOLM@STER: Cinderella Girls), containing 22 images and their tags.
The core tags of this character are `long_hair, brown_hair, brown_eyes, breasts, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 22 | 13.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/komuro_chinami_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 22 | 13.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/komuro_chinami_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 36 | 19.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/komuro_chinami_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 22 | 13.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/komuro_chinami_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 36 | 19.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/komuro_chinami_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/komuro_chinami_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, solo, smile, cleavage, jewelry |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | cleavage | jewelry |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-----------|:----------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X |
|
the-french-artist/wikipedia_20220301.simple_sentence_split | ---
license: apache-2.0
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: sentence_index
dtype: int64
- name: line_index
dtype: int64
splits:
- name: train
num_bytes: 636951662
num_examples: 3942576
download_size: 160914787
dataset_size: 636951662
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|