datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
arminmrm93/free_recipe_no_embedding | ---
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2219640
num_examples: 2389
download_size: 1116654
dataset_size: 2219640
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "free_recipe_no_embedding"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KATANABRAVE/stories | ---
license: llama2
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: title
dtype: string
- name: article
dtype: string
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 110879624
num_examples: 8500
- name: validation
num_bytes: 3383807
num_examples: 277
download_size: 48437278
dataset_size: 114263431
---
|
atitaarora/qdrant_doc | ---
language:
- en
license: apache-2.0
---
|
LHF/escorpius | ---
license: cc-by-nc-nd-4.0
language:
- es
multilinguality:
- monolingual
size_categories:
- 100M<n<1B
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
---
# esCorpius: A Massive Spanish Crawling Corpus
## Introduction
In recent years, Transformer-based models have led to significant advances in language modelling for natural language processing. However, they require a vast amount of data to be (pre-)trained, and there is a lack of corpora in languages other than English. Recently, several initiatives have presented multilingual datasets obtained from automatic web crawling. However, the results in Spanish present important shortcomings, as they are either too small in comparison with other languages, or present a low quality derived from sub-optimal cleaning and deduplication. In this work, we introduce esCorpius, a Spanish crawling corpus obtained from nearly 1 PB of Common Crawl data. It is the most extensive corpus in Spanish with this level of quality in the extraction, purification and deduplication of web textual content. Our data curation process involves a novel highly parallel cleaning pipeline and encompasses a series of deduplication mechanisms that together ensure the integrity of both document and paragraph boundaries. Additionally, we retain both the source web page URL and the WARC shard origin URL in order to comply with EU regulations. esCorpius has been released under the CC BY-NC-ND 4.0 license.
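The paragraph-level deduplication mentioned above can be illustrated with a minimal sketch. The function below is an assumed, simplified hashing approach for illustration only — it is not the actual dLHF pipeline — and it normalizes whitespace before hashing so trivially different copies collapse to the same key while document boundaries are preserved:

```python
import hashlib

def dedup_paragraphs(documents):
    """Remove duplicate paragraphs across a list of documents while
    preserving document boundaries. Paragraphs are separated by blank
    lines within each document string."""
    seen = set()
    result = []
    for doc in documents:
        kept = []
        for para in doc.split("\n\n"):
            # Normalize whitespace before hashing so near-identical
            # copies map to the same key.
            key = hashlib.sha256(" ".join(para.split()).encode("utf-8")).hexdigest()
            if para.strip() and key not in seen:
                seen.add(key)
                kept.append(para)
        result.append("\n\n".join(kept))
    return result

docs = ["Hola mundo.\n\nSegundo párrafo.", "Hola  mundo.\n\nTexto nuevo."]
# The second copy of "Hola mundo." is dropped; "Texto nuevo." survives.
print(dedup_paragraphs(docs))
```

A production pipeline over ~1 PB of Common Crawl data would shard this hash set across workers, but the invariant is the same: each paragraph is kept exactly once, inside its original document.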
## Statistics
| **Corpus** | OSCAR<br>22.01 | mC4 | CC-100 | ParaCrawl<br>v9 | esCorpius<br>(ours) |
|-------------------------|----------------|--------------|-----------------|-----------------|-------------------------|
| **Size (ES)** | 381.9 GB | 1,600.0 GB | 53.3 GB | 24.0 GB | 322.5 GB |
| **Docs (ES)** | 51M | 416M | - | - | 104M |
| **Words (ES)** | 42,829M | 433,000M | 9,374M | 4,374M | 50,773M |
| **Lang.<br>identifier** | fastText | CLD3 | fastText | CLD2 | CLD2 + fastText |
| **Elements** | Document | Document | Document | Sentence | Document and paragraph |
| **Parsing quality** | Medium | Low | Medium | High | High |
| **Cleaning quality** | Low | No cleaning | Low | High | High |
| **Deduplication** | No | No | No | Bicleaner | dLHF |
| **Language** | Multilingual | Multilingual | Multilingual | Multilingual | Spanish |
| **License** | CC-BY-4.0 | ODC-By-v1.0 | Common<br>Crawl | CC0 | CC-BY-NC-ND |
## Citation
Link to the paper: https://www.isca-speech.org/archive/pdfs/iberspeech_2022/gutierrezfandino22_iberspeech.pdf / https://arxiv.org/abs/2206.15147
Cite this work:
```
@inproceedings{gutierrezfandino22_iberspeech,
author={Asier Gutiérrez-Fandiño and David Pérez-Fernández and Jordi Armengol-Estapé and David Griol and Zoraida Callejas},
title={{esCorpius: A Massive Spanish Crawling Corpus}},
year=2022,
booktitle={Proc. IberSPEECH 2022},
pages={126--130},
doi={10.21437/IberSPEECH.2022-26}
}
```
## Disclaimer
We did not perform any kind of filtering and/or censorship on the corpus; we expect users to do so by applying their own methods. We are not liable for any misuse of the corpus. |
jiangyige/ParaphrasedEnglishSentencePairs | ---
license: unknown
---
|
claudios/dypybench_functions | ---
license: cc-by-4.0
pretty_name: DyPyBench Functions
tags:
- code
dataset_info:
features:
- name: nwo
dtype: string
- name: sha
dtype: string
- name: path
dtype: string
- name: identifier
dtype: string
- name: parameters
dtype: string
- name: return_statement
dtype: string
- name: docstring
dtype: string
- name: docstring_summary
dtype: string
- name: func_begin
dtype: int64
- name: func_end
dtype: int64
- name: function
dtype: string
- name: url
dtype: string
- name: project
dtype: int64
- name: executed_lines
sequence: int64
- name: executed_lines_pc
dtype: float64
- name: missing_lines
sequence: int64
- name: missing_lines_pc
dtype: float64
- name: covered
dtype: bool
- name: filecoverage
dtype: float64
- name: function_lines
dtype: int64
- name: mccabe
dtype: int64
- name: coverage
dtype: float64
- name: docstring_lines
dtype: int64
- name: function_nodoc
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 22383711
num_examples: 11168
download_size: 6805239
dataset_size: 22383711
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# DyPyBench Functions Datasets
[DyPyBench](https://zenodo.org/record/7886366) is a dataset constructed by Piyush Krishan Bajaj at the Software Lab, Institute of Software Engineering, University of Stuttgart. It contains 50 open source projects from GitHub.
We used [Nathan Cooper's](https://github.com/ncoop57/function_parser) `function_parser` tool, based on GitHub's CodeSearchNet `function_parser`, to extract all functions from all the projects, excluding library functions in the virtualenv. We also ran all tests in DyPyBench and produced a coverage report in JSON. Not all projects resulted in a coverage report due to project-specific coverage report settings.
The columns provided are as follows:
| Column | Type | Notes |
| ----------------- | ---------- | ----------------------------------------------------------------------------------------------- |
| id                | Int64      | Unique id of the function                                                                       |
| project | Int64 | DyPyBench project id |
| nwo | string | Project name in repo/project format |
| sha | string | SHA commit hash |
| url | string | GitHub URL to function lines at commit |
| path | string | Path of file containing function relative to project root |
| func_begin | Int64 | Begin of function line number in source file |
| func_end | Int64 | End of function line number in source file |
| function_lines | Int64 | Function line count |
| identifier | string | Function identifier |
| parameters | string | Function parameters |
| function | string | Source code of function including docstring |
| function_nodoc | string | Source code of function without docstring |
| docstring | string | Function docstring |
| docstring_lines | Int64 | Line count of docstring |
| docstring_summary | string | Function docstring summary |
| return_statement | string | Function return statement |
| filecoverage | Float64 | If coverage available, coverage percentage of file function is from |
| executed_lines | array[int] | If coverage available, executed lines relative to function lines (i.e. [0,1,2,...]) |
| executed_lines_pc | Float64 | If coverage available, executed line count over total function line count |
| missing_lines | array[int] | If coverage available, missing (unexecuted) lines relative to function lines (i.e. [0,1,2,...]) |
| missing_lines_pc | Float64 | If coverage available, missing line count over total function line count |
| covered | boolean | True if all lines executed and/or no lines missing |
| mccabe | Int64 | McCabe complexity of function |
| coverage | Float64 | Function coverage percentage (1-missing lines %) |
Note: Missing/executed lines purposefully exclude lines skipped by `pytest` due to configuration e.g. line level `# pragma: no cover`.
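The arithmetic relating the coverage columns can be sketched as follows. This is an assumed reconstruction implied by the column descriptions above (with percentages taken as fractions in [0, 1]), not the actual extraction code:

```python
def coverage_stats(executed_lines, missing_lines, function_lines):
    """Derive the percentage columns from the raw line lists,
    mirroring the column definitions in the table above."""
    executed_lines_pc = len(executed_lines) / function_lines
    missing_lines_pc = len(missing_lines) / function_lines
    return {
        "executed_lines_pc": executed_lines_pc,
        "missing_lines_pc": missing_lines_pc,
        # coverage is 1 minus the missing-line fraction
        "coverage": 1.0 - missing_lines_pc,
        # covered means no lines were missed
        "covered": len(missing_lines) == 0,
    }

# A 5-line function where line 4 (relative to the function) was never run:
stats = coverage_stats(executed_lines=[0, 1, 2, 3], missing_lines=[4], function_lines=5)
print(stats)
```

Here `coverage` comes out to 0.8 and `covered` to `False`, matching the "1 - missing lines %" definition of the `coverage` column.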
|
arfu/emr_info_extract | ---
dataset_info:
features:
- name: id
dtype: int64
- name: dialogue
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 1842245
num_examples: 2019
- name: test
num_bytes: 123971
num_examples: 120
download_size: 440433
dataset_size: 1966216
---
# Dataset Card for "emr_info_extract"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gaborcselle/font-examples | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': AlfaSlabOne-Regular
'1': ArchitectsDaughter-Regular
'2': Arial
'3': Arial Black
'4': Arial Bold
'5': Arial Bold Italic
'6': Avenir
'7': Bangers-Regular
'8': BlackOpsOne-Regular
'9': Courier
'10': Georgia
'11': Helvetica
'12': IBMPlexSans-Regular
'13': Inter-Regular
'14': KaushanScript-Regular
'15': Lato-Regular
'16': Lobster-Regular
'17': Lora-Regular
'18': Merriweather-Regular
'19': Niconne-Regular
'20': OpenSans-Bold
'21': OpenSans-Italic
'22': OpenSans-Light
'23': Pacifico-Regular
'24': PixelifySans-Regular
'25': PlayfairDisplay-Regular
'26': Poppins-Regular
'27': Rakkas-Regular
'28': Roboto-Regular
'29': RobotoMono-Regular
'30': RobotoSlab-Regular
'31': Rubik-Regular
'32': SpaceMono-Regular
'33': Tahoma
'34': Tahoma Bold
'35': Times New Roman
'36': Times New Roman Bold
'37': Times New Roman Bold Italic
'38': Times New Roman Italic
'39': TitilliumWeb-Regular
'40': Trebuchet MS
'41': Trebuchet MS Bold
'42': Trebuchet MS Bold Italic
'43': Trebuchet MS Italic
'44': Verdana
'45': Verdana Bold
'46': Verdana Bold Italic
'47': Verdana Italic
splits:
- name: train
num_bytes: 108384385.6
num_examples: 2400
download_size: 104995129
dataset_size: 108384385.6
---
# Dataset Card for "font-examples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HHazard/autotrain-data-llama2 | ---
license: apache-2.0
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/98fc77eb | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1341
dataset_size: 188
---
# Dataset Card for "98fc77eb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cfrerebeau/picto2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: data
num_bytes: 11970350.0
num_examples: 48
download_size: 11612085
dataset_size: 11970350.0
---
# Dataset Card for "picto2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tural/wiki-unzh | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 20277571711
num_examples: 6458670
download_size: 11689463675
dataset_size: 20277571711
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wiki-unzh"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shredder-31/Min1_Sum_SummarizationData | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 9591433
num_examples: 500
- name: dev
num_bytes: 944676
num_examples: 50
- name: test
num_bytes: 952916
num_examples: 50
download_size: 5223818
dataset_size: 11489025
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
|
Thanmay/boolq-ta | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: bool
- name: passage
dtype: string
- name: itv2 ta question
dtype: string
- name: itv2 ta passage
dtype: string
splits:
- name: validation
num_bytes: 7707383
num_examples: 3270
download_size: 3265174
dataset_size: 7707383
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
ChangeIsKey/open-riksdag | ---
language:
- sv
license: cc-by-4.0
size_categories:
- 1M<n<10M
pretty_name: Open Riksdag-103
tags:
- diachronic
- semantic change
---
This is a dataset of text from the Riksdag, Sweden's national legislative body.
The original data is available without a license under the Re-use of Public Administration Documents Act (2010:566) at https://data.riksdagen.se/data/dokument
This dataset is derivative of a version compiled by Språkbanken Text (SBX) at the University of Gothenburg (Sweden). That version consists of XML files split by source document type (motions, questions, protocol, etc.) and includes additional linguistic annotations. It is available under a CC BY 4.0 license at https://spraakbanken.gu.se/resurser/rd
The focus of this huggingface dataset is to organise the data for fine-grained diachronic modeling. In a nutshell, this version offers:
- all sentences including one or more of 103 target words, which were chosen by TF-IDF (described below)
- per-month subsets (with all document types combined)
- one line per sentence (sentences shorter than 4 words were discarded)
- data includes: date, document_type, document_id, target_word, and text.
The dataset builder requires a `years` argument, which must be an iterable of years between 1979 and 2019 (inclusive). This can be supplied to the `load_dataset` function as a keyword argument.
For example, to load raw sentences from the `prop` and `bet` data sources run:
```python
from datasets import load_dataset
data = load_dataset('ChangeIsKey/open-riksdag', 'sentences', years=range(1999, 2000), sources=['prop', 'bet'])
```
License is CC BY 4.0 with attribution.
|
iotengtr/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: comments
dtype: int64
- name: created_at
dtype: timestamp[ns, tz=UTC]
- name: updated_at
dtype: timestamp[ns, tz=UTC]
- name: closed_at
dtype: timestamp[ns, tz=UTC]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: float64
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: is_pull_request
dtype: bool
- name: handling_time
dtype: float64
splits:
- name: train
num_bytes: 16667996
num_examples: 5575
download_size: 3986763
dataset_size: 16667996
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
valurank/Explicit_content | ---
license: other
task_categories:
- text-classification
size_categories:
- 1K<n<10K
---
---
license:
- other
language:
- en
multilinguality:
- monolingual
task_categories:
- text-classification
task_ids:
- multi-class-classification
---
# Dataset Card for Explicit content detection
## Table of Contents
- [Dataset Description](#dataset-description)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Source Data](#source-data)
## Dataset Description
1,189 news articles classified into two categories: "Explicit" if the article contains explicit content and "Not_Explicit" if not.
## Languages
The text in the dataset is in English.
## Dataset Structure
The dataset consists of two columns, Article and Category.
The Article column contains the news article, and the Category column contains the class each article belongs to, i.e. whether or not it contains explicit content.
## Source Data
The dataset is queried from the Otherweb database |
mbruton/galician_srl | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: tags
sequence:
class_label:
names:
'0': O
'1': r0:arg0
'2': r0:arg1
'3': r0:arg2
'4': r0:root
'5': r10:arg0
'6': r10:arg1
'7': r10:root
'8': r11:arg0
'9': r11:root
'10': r12:arg1
'11': r12:root
'12': r13:arg1
'13': r13:root
'14': r1:arg0
'15': r1:arg1
'16': r1:arg2
'17': r1:root
'18': r2:arg0
'19': r2:arg1
'20': r2:arg2
'21': r2:root
'22': r3:arg0
'23': r3:arg1
'24': r3:arg2
'25': r3:root
'26': r4:arg0
'27': r4:arg1
'28': r4:arg2
'29': r4:root
'30': r5:arg0
'31': r5:arg1
'32': r5:arg2
'33': r5:root
'34': r6:arg0
'35': r6:arg1
'36': r6:arg2
'37': r6:root
'38': r7:arg0
'39': r7:arg1
'40': r7:arg2
'41': r7:root
'42': r8:arg0
'43': r8:arg1
'44': r8:arg2
'45': r8:root
'46': r9:arg0
'47': r9:arg1
'48': r9:arg2
'49': r9:root
- name: ids
dtype: int64
splits:
- name: train
num_bytes: 2241310
num_examples: 3986
- name: test
num_bytes: 555760
num_examples: 997
download_size: 675236
dataset_size: 2797070
license: apache-2.0
task_categories:
- token-classification
language:
- gl
pretty_name: GalicianSRL
size_categories:
- 1K<n<10K
---
# Dataset Card for GalicianSRL
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Limitations](#limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Citation Information](#citation-information)
## Dataset Description
- **Repository:** [GalicianSRL Project Hub](https://github.com/mbruton0426/GalicianSRL)
- **Paper:** To be updated
- **Point of Contact:** [Micaella Bruton](mailto:micaellabruton@gmail.com)
### Dataset Summary
The GalicianSRL dataset is a Galician-language dataset of tokenized sentences and the semantic role for each token within a sentence. Semantic roles are limited to verbal roots, argument 0, argument 1, and argument 2. This dataset was created to support the task of semantic role labeling in the Galician language, as no publicly available datasets existed as of the date of publication to the contributor's knowledge.
### Languages
The text in the dataset is in Galician.
## Dataset Structure
### Data Instances
A typical data point comprises a tokenized sentence, tags for each token, and a sentence id number. An example from the GalicianSRL dataset looks as follows:
```
{'tokens': ['O', 'Pleno', 'poderá', ',', 'con', 'todo', ',', 'avocar', 'en', 'calquera', 'momento', 'o', 'debate', 'e', 'votación', 'de', 'calquera', 'proxecto', 'ou', 'proposición', 'de', 'lei', 'que', 'xa', 'fora', 'obxecto', 'de', 'esta', 'delegación', '.'],
'tags': [0, 1, 4, 0, 0, 0, 0, 17, 0, 0, 16, 0, 15, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
'ids': 504}
```
Tags are assigned an id number according to the index of its label as listed in:
```python
>>> dataset['train'].features['tags'].feature.names
```
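Decoding the integer tags of the example above back into label strings is then a simple index lookup. The sketch below embeds the first 18 label names from the `class_label` metadata so it runs standalone, instead of loading the dataset:

```python
# First 18 label names from the dataset's `tags` class_label mapping
# (indices match the metadata above; the full list has 50 entries).
names = ['O', 'r0:arg0', 'r0:arg1', 'r0:arg2', 'r0:root',
         'r10:arg0', 'r10:arg1', 'r10:root', 'r11:arg0', 'r11:root',
         'r12:arg1', 'r12:root', 'r13:arg1', 'r13:root',
         'r1:arg0', 'r1:arg1', 'r1:arg2', 'r1:root']

# Tags from the example data point above.
tags = [0, 1, 4, 0, 0, 0, 0, 17, 0, 0, 16, 0, 15, 0, 0,
        0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]

labels = [names[t] for t in tags]
print(labels[:3])  # ['O', 'r0:arg0', 'r0:root']
```

So in the example sentence, 'Pleno' is tagged `r0:arg0`, 'poderá' is the root of relation 0, and 'avocar' is the root of relation 1.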
### Data Fields
- `tokens`: a list of strings
- `tags`: a list of integers
- `ids`: a sentence id, as an integer
### Data Splits
The data is split into a training and test set. The final structure and split sizes are as follows:
```
DatasetDict({
train: Dataset({
features: ['tokens', 'tags', 'ids'],
num_rows: 1005
})
test: Dataset({
features: ['tokens', 'tags', 'ids'],
num_rows: 252
})
})
```
## Dataset Creation
### Curation Rationale
GalicianSRL was built to provide a dataset for semantic role labeling in Galician and expand NLP resources available for the Galician language.
### Source Data
#### Initial Data Collection and Normalization
Data was collected from both the [CTG UD annotated corpus](https://github.com/UniversalDependencies/UD_Galician-CTG) and the [TreeGal UD annotated corpus](https://github.com/UniversalDependencies/UD_Galician-TreeGal), and combined to collect the requisite information for this task. For more information, please refer to the publication listed in the citation.
## Considerations for Using the Data
### Limitations
The purpose of this dataset is to help develop a working semantic role labeling system for Galician, as SRL systems have been shown to improve a variety of NLP tasks. It should be noted, however, that Galician is considered a low-resource language at this time, and as such the dataset has an extremely limited scope. This dataset would benefit from manual validation by a native speaker of Galician, the inclusion of additional sentences, and an extension of arguments beyond arg0, arg1, and arg2.
## Additional Information
### Dataset Curators
The dataset was created by Micaella Bruton, as part of her Master's thesis.
### Citation Information
```
@inproceedings{bruton-beloucif-2023-bertie,
title = "{BERT}ie Bott{'}s Every Flavor Labels: A Tasty Introduction to Semantic Role Labeling for {G}alician",
author = "Bruton, Micaella and
Beloucif, Meriem",
editor = "Bouamor, Houda and
Pino, Juan and
Bali, Kalika",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.emnlp-main.671",
doi = "10.18653/v1/2023.emnlp-main.671",
pages = "10892--10902",
abstract = "In this paper, we leverage existing corpora, WordNet, and dependency parsing to build the first Galician dataset for training semantic role labeling systems in an effort to expand available NLP resources. Additionally, we introduce verb indexing, a new pre-processing method, which helps increase the performance when semantically parsing highly-complex sentences. We use transfer-learning to test both the resource and the verb indexing method. Our results show that the effects of verb indexing were amplified in scenarios where the model was both pre-trained and fine-tuned on datasets utilizing the method, but improvements are also noticeable when only used during fine-tuning. The best-performing Galician SRL model achieved an f1 score of 0.74, introducing a baseline for future Galician SRL systems. We also tested our method on Spanish where we achieved an f1 score of 0.83, outperforming the baseline set by the 2009 CoNLL Shared Task by 0.025 showing the merits of our verb indexing method for pre-processing.",
}
``` |
jaystoneshi/mus | ---
license: apache-2.0
---
|
ZharfaTech/ZharfaTech-Open-Platypus-Persian-Farsi | ---
license: cc-by-4.0
task_categories:
- text-generation
- text2text-generation
- summarization
- question-answering
language:
- fa
tags:
- code
- reasoning
- math
pretty_name: ZharfaTech Open-Platypus Persian (Farsi)
size_categories:
- 10K<n<100K
---
# Persian Open-Platypus
## About ZharfaTech
ZharfaTech is a pioneer in developing Large Language Models (LLMs) tailored for the Persian language, aiming to empower over 100 million Persian speakers worldwide. Our mission encompasses bridging the digital divide in LLM-related services like content generation, customer relationship systems, and more, with a dual approach of fostering open-source collaboration and delivering high-value, specialized closed-source solutions.
## Dataset Overview
The Persian Open-Platypus dataset is a comprehensive Persian translation of the ["Open-Platypus" dataset](https://huggingface.co/datasets/garage-bAInd/Open-Platypus), originally aimed at enhancing LLM logical reasoning skills. This translation is part of our initiative to create high-quality resources for Persian LLM development, using a high-performance local translation model. The translation process was accomplished in 20 hours on 3 Nvidia GPUs.
### Key Features:
- **Language:** Persian
- **Source:** Translated from "Open-Platypus"
- **Translation Method:** Local translation model
- **Processing Time:** 20 hours on 3 Nvidia GPUs
### Included Datasets:
The original Open-Platypus dataset comprises several datasets, all aimed at logical reasoning enhancement. They include PRM800K, MATH, ScienceQA, SciBench, ReClor, TheoremQA, and more, filtered to maintain uniqueness and relevance. Our Persian translation adheres to these selections.
## Objective and Scope
At ZharfaTech, we aim to enhance Persian LLM technology capabilities through:
- Fine-tuning open-source models for Persian language understanding.
- Creating specialized datasets to support comprehensive model training.
- Developing advanced closed-source models for specific industry needs.
Our work strives to democratize LLM technology for Persian speakers, fostering community collaboration and innovation.
## Contributions
We welcome community contributions to refine and expand this dataset; suggestions and enhancements are appreciated.
## Acknowledgments
Our sincere thanks to the creators of the original "Open-Platypus" dataset and all contributors to the datasets included therein. Special appreciation goes to our team members who skillfully managed the translation, ensuring the dataset's quality and relevance to the Persian language.
## License
This dataset is released under CC BY 4.0, consistent with the original dataset's licensing terms.
## Contact Us
For more information about our work or this dataset, please contact ZharfaTech at [https://zharfa.tech](https://zharfa.tech).
---
Empowering the Persian language community with advanced LLM technologies - ZharfaTech. |
mask-distilled-one-sec-cv12/chunk_142 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1070348584
num_examples: 210202
download_size: 1092140988
dataset_size: 1070348584
---
# Dataset Card for "chunk_142"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
c4ba/bielmc | ---
license: openrail
---
|
Adel-Elwan/Artificial-intelligence-dataset-for-IR-systems | ---
task_categories:
- question-answering
language:
- en
tags:
- artificial-intelligence
- Information-Retrieval
pretty_name: Information Retrieval dataset in the AI domain
size_categories:
- 100K<n<1M
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
- information-retrieval
- semantic-search
### Languages
- English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Qwen__Qwen-72B | ---
pretty_name: Evaluation run of Qwen/Qwen-72B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Qwen/Qwen-72B](https://huggingface.co/Qwen/Qwen-72B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 62 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Qwen__Qwen-72B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-05T02:10:37.267059](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen-72B/blob/main/results_2023-12-05T02-10-37.267059.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7692238060042156,\n\
\ \"acc_stderr\": 0.027807291244956196,\n \"acc_norm\": 0.7731238892784332,\n\
\ \"acc_norm_stderr\": 0.028330728981592973,\n \"mc1\": 0.42717258261933905,\n\
\ \"mc1_stderr\": 0.017316834410963933,\n \"mc2\": 0.6019109516805667,\n\
\ \"mc2_stderr\": 0.014606562783785249\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.0141696645203031,\n\
\ \"acc_norm\": 0.6518771331058021,\n \"acc_norm_stderr\": 0.01392100859517935\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6684923322047401,\n\
\ \"acc_stderr\": 0.004697929774670292,\n \"acc_norm\": 0.8593905596494722,\n\
\ \"acc_norm_stderr\": 0.0034690778470563865\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n\
\ \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n\
\ \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8552631578947368,\n \"acc_stderr\": 0.028631951845930394,\n\
\ \"acc_norm\": 0.8552631578947368,\n \"acc_norm_stderr\": 0.028631951845930394\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8377358490566038,\n \"acc_stderr\": 0.022691482872035342,\n\
\ \"acc_norm\": 0.8377358490566038,\n \"acc_norm_stderr\": 0.022691482872035342\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9375,\n\
\ \"acc_stderr\": 0.02024219611347799,\n \"acc_norm\": 0.9375,\n \
\ \"acc_norm_stderr\": 0.02024219611347799\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.65,\n \"acc_stderr\": 0.047937248544110175,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110175\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7803468208092486,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.7803468208092486,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04975185951049946,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04975185951049946\n },\n\
\ \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n\
\ \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7872340425531915,\n \"acc_stderr\": 0.026754391348039766,\n\
\ \"acc_norm\": 0.7872340425531915,\n \"acc_norm_stderr\": 0.026754391348039766\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n\
\ \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n\
\ \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.03416520447747549,\n\
\ \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.03416520447747549\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6878306878306878,\n \"acc_stderr\": 0.02386520683697258,\n \"\
acc_norm\": 0.6878306878306878,\n \"acc_norm_stderr\": 0.02386520683697258\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8935483870967742,\n\
\ \"acc_stderr\": 0.017545102951656632,\n \"acc_norm\": 0.8935483870967742,\n\
\ \"acc_norm_stderr\": 0.017545102951656632\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03344283744280459,\n\
\ \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03344283744280459\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\"\
: 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9343434343434344,\n \"acc_stderr\": 0.017646526677233317,\n \"\
acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.017646526677233317\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9896373056994818,\n \"acc_stderr\": 0.007308424386792194,\n\
\ \"acc_norm\": 0.9896373056994818,\n \"acc_norm_stderr\": 0.007308424386792194\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8102564102564103,\n \"acc_stderr\": 0.019880165406588768,\n\
\ \"acc_norm\": 0.8102564102564103,\n \"acc_norm_stderr\": 0.019880165406588768\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4962962962962963,\n \"acc_stderr\": 0.03048470166508437,\n \
\ \"acc_norm\": 0.4962962962962963,\n \"acc_norm_stderr\": 0.03048470166508437\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n\
\ \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5695364238410596,\n \"acc_stderr\": 0.04042809961395634,\n \"\
acc_norm\": 0.5695364238410596,\n \"acc_norm_stderr\": 0.04042809961395634\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9284403669724771,\n \"acc_stderr\": 0.011051255247815476,\n \"\
acc_norm\": 0.9284403669724771,\n \"acc_norm_stderr\": 0.011051255247815476\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6990740740740741,\n \"acc_stderr\": 0.03128039084329883,\n \"\
acc_norm\": 0.6990740740740741,\n \"acc_norm_stderr\": 0.03128039084329883\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9362745098039216,\n \"acc_stderr\": 0.01714392165552496,\n \"\
acc_norm\": 0.9362745098039216,\n \"acc_norm_stderr\": 0.01714392165552496\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065505,\n \
\ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065505\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8251121076233184,\n\
\ \"acc_stderr\": 0.025495284626444965,\n \"acc_norm\": 0.8251121076233184,\n\
\ \"acc_norm_stderr\": 0.025495284626444965\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.9007633587786259,\n \"acc_stderr\": 0.02622223517147735,\n\
\ \"acc_norm\": 0.9007633587786259,\n \"acc_norm_stderr\": 0.02622223517147735\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540616,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540616\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8588957055214724,\n \"acc_stderr\": 0.027351605518389752,\n\
\ \"acc_norm\": 0.8588957055214724,\n \"acc_norm_stderr\": 0.027351605518389752\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n\
\ \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n\
\ \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331362,\n\
\ \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331362\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9487179487179487,\n\
\ \"acc_stderr\": 0.01445018117687274,\n \"acc_norm\": 0.9487179487179487,\n\
\ \"acc_norm_stderr\": 0.01445018117687274\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9169859514687101,\n\
\ \"acc_stderr\": 0.009866287394639536,\n \"acc_norm\": 0.9169859514687101,\n\
\ \"acc_norm_stderr\": 0.009866287394639536\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8410404624277457,\n \"acc_stderr\": 0.019685307033571946,\n\
\ \"acc_norm\": 0.8410404624277457,\n \"acc_norm_stderr\": 0.019685307033571946\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6435754189944134,\n\
\ \"acc_stderr\": 0.016018239710513398,\n \"acc_norm\": 0.6435754189944134,\n\
\ \"acc_norm_stderr\": 0.016018239710513398\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.020464175124332632,\n\
\ \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.020464175124332632\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8360128617363344,\n\
\ \"acc_stderr\": 0.021029576464662695,\n \"acc_norm\": 0.8360128617363344,\n\
\ \"acc_norm_stderr\": 0.021029576464662695\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.018877353839571842,\n\
\ \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.018877353839571842\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6524822695035462,\n \"acc_stderr\": 0.028406627809590954,\n \
\ \"acc_norm\": 0.6524822695035462,\n \"acc_norm_stderr\": 0.028406627809590954\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6127770534550195,\n\
\ \"acc_stderr\": 0.012441155326854931,\n \"acc_norm\": 0.6127770534550195,\n\
\ \"acc_norm_stderr\": 0.012441155326854931\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8455882352941176,\n \"acc_stderr\": 0.021950024722922033,\n\
\ \"acc_norm\": 0.8455882352941176,\n \"acc_norm_stderr\": 0.021950024722922033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8235294117647058,\n \"acc_stderr\": 0.015422512066262552,\n \
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.015422512066262552\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8244897959183674,\n \"acc_stderr\": 0.02435280072297001,\n\
\ \"acc_norm\": 0.8244897959183674,\n \"acc_norm_stderr\": 0.02435280072297001\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759033,\n \
\ \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759033\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42717258261933905,\n\
\ \"mc1_stderr\": 0.017316834410963933,\n \"mc2\": 0.6019109516805667,\n\
\ \"mc2_stderr\": 0.014606562783785249\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.824782951854775,\n \"acc_stderr\": 0.010684179227706177\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7043214556482184,\n \
\ \"acc_stderr\": 0.012570068947898772\n }\n}\n```"
repo_url: https://huggingface.co/Qwen/Qwen-72B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|arc:challenge|25_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|gsm8k|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hellaswag|10_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T02-10-37.267059.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-05T02-10-37.267059.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_05T02_10_37.267059
path:
- '**/details_harness|winogrande|5_2023-12-05T02-10-37.267059.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-05T02-10-37.267059.parquet'
---
# Dataset Card for Evaluation run of Qwen/Qwen-72B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Qwen/Qwen-72B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Qwen/Qwen-72B](https://huggingface.co/Qwen/Qwen-72B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 62 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Qwen__Qwen-72B",
"harness_winogrande_5",
split="latest")
```
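The per-task config names listed in the YAML above follow mechanically from the metric keys used in the results JSON: `|`, `:` and `-` are each replaced by `_`. A minimal sketch of that mapping (the helper name `config_name_for` is ours, not part of the `datasets` API):

```python
def config_name_for(metric_key: str) -> str:
    """Map a results-JSON metric key (e.g. 'harness|truthfulqa:mc|0')
    to the corresponding dataset config name (e.g. 'harness_truthfulqa_mc_0')."""
    return metric_key.replace("|", "_").replace(":", "_").replace("-", "_")

# Examples drawn from the config list above:
print(config_name_for("harness|truthfulqa:mc|0"))           # harness_truthfulqa_mc_0
print(config_name_for("harness|hendrycksTest-virology|5"))  # harness_hendrycksTest_virology_5
```

The derived name can then be passed as the second argument to `load_dataset` as in the example above.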
## Latest results
These are the [latest results from run 2023-12-05T02:10:37.267059](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen-72B/blob/main/results_2023-12-05T02-10-37.267059.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7692238060042156,
"acc_stderr": 0.027807291244956196,
"acc_norm": 0.7731238892784332,
"acc_norm_stderr": 0.028330728981592973,
"mc1": 0.42717258261933905,
"mc1_stderr": 0.017316834410963933,
"mc2": 0.6019109516805667,
"mc2_stderr": 0.014606562783785249
},
"harness|arc:challenge|25": {
"acc": 0.6220136518771331,
"acc_stderr": 0.0141696645203031,
"acc_norm": 0.6518771331058021,
"acc_norm_stderr": 0.01392100859517935
},
"harness|hellaswag|10": {
"acc": 0.6684923322047401,
"acc_stderr": 0.004697929774670292,
"acc_norm": 0.8593905596494722,
"acc_norm_stderr": 0.0034690778470563865
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8552631578947368,
"acc_stderr": 0.028631951845930394,
"acc_norm": 0.8552631578947368,
"acc_norm_stderr": 0.028631951845930394
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8377358490566038,
"acc_stderr": 0.022691482872035342,
"acc_norm": 0.8377358490566038,
"acc_norm_stderr": 0.022691482872035342
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9375,
"acc_stderr": 0.02024219611347799,
"acc_norm": 0.9375,
"acc_norm_stderr": 0.02024219611347799
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110175,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110175
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7803468208092486,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.7803468208092486,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5,
"acc_stderr": 0.04975185951049946,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04975185951049946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7872340425531915,
"acc_stderr": 0.026754391348039766,
"acc_norm": 0.7872340425531915,
"acc_norm_stderr": 0.026754391348039766
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.03416520447747549,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.03416520447747549
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6878306878306878,
"acc_stderr": 0.02386520683697258,
"acc_norm": 0.6878306878306878,
"acc_norm_stderr": 0.02386520683697258
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8935483870967742,
"acc_stderr": 0.017545102951656632,
"acc_norm": 0.8935483870967742,
"acc_norm_stderr": 0.017545102951656632
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03344283744280459,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03344283744280459
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.017646526677233317,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.017646526677233317
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9896373056994818,
"acc_stderr": 0.007308424386792194,
"acc_norm": 0.9896373056994818,
"acc_norm_stderr": 0.007308424386792194
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8102564102564103,
"acc_stderr": 0.019880165406588768,
"acc_norm": 0.8102564102564103,
"acc_norm_stderr": 0.019880165406588768
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.03048470166508437,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.03048470166508437
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5695364238410596,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.5695364238410596,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9284403669724771,
"acc_stderr": 0.011051255247815476,
"acc_norm": 0.9284403669724771,
"acc_norm_stderr": 0.011051255247815476
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6990740740740741,
"acc_stderr": 0.03128039084329883,
"acc_norm": 0.6990740740740741,
"acc_norm_stderr": 0.03128039084329883
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9362745098039216,
"acc_stderr": 0.01714392165552496,
"acc_norm": 0.9362745098039216,
"acc_norm_stderr": 0.01714392165552496
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065505,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065505
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8251121076233184,
"acc_stderr": 0.025495284626444965,
"acc_norm": 0.8251121076233184,
"acc_norm_stderr": 0.025495284626444965
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9007633587786259,
"acc_stderr": 0.02622223517147735,
"acc_norm": 0.9007633587786259,
"acc_norm_stderr": 0.02622223517147735
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540616,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540616
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8588957055214724,
"acc_stderr": 0.027351605518389752,
"acc_norm": 0.8588957055214724,
"acc_norm_stderr": 0.027351605518389752
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331362,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331362
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9487179487179487,
"acc_stderr": 0.01445018117687274,
"acc_norm": 0.9487179487179487,
"acc_norm_stderr": 0.01445018117687274
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9169859514687101,
"acc_stderr": 0.009866287394639536,
"acc_norm": 0.9169859514687101,
"acc_norm_stderr": 0.009866287394639536
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8410404624277457,
"acc_stderr": 0.019685307033571946,
"acc_norm": 0.8410404624277457,
"acc_norm_stderr": 0.019685307033571946
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6435754189944134,
"acc_stderr": 0.016018239710513398,
"acc_norm": 0.6435754189944134,
"acc_norm_stderr": 0.016018239710513398
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8496732026143791,
"acc_stderr": 0.020464175124332632,
"acc_norm": 0.8496732026143791,
"acc_norm_stderr": 0.020464175124332632
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8360128617363344,
"acc_stderr": 0.021029576464662695,
"acc_norm": 0.8360128617363344,
"acc_norm_stderr": 0.021029576464662695
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.018877353839571842,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.018877353839571842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6524822695035462,
"acc_stderr": 0.028406627809590954,
"acc_norm": 0.6524822695035462,
"acc_norm_stderr": 0.028406627809590954
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6127770534550195,
"acc_stderr": 0.012441155326854931,
"acc_norm": 0.6127770534550195,
"acc_norm_stderr": 0.012441155326854931
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8455882352941176,
"acc_stderr": 0.021950024722922033,
"acc_norm": 0.8455882352941176,
"acc_norm_stderr": 0.021950024722922033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.015422512066262552,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.015422512066262552
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8244897959183674,
"acc_stderr": 0.02435280072297001,
"acc_norm": 0.8244897959183674,
"acc_norm_stderr": 0.02435280072297001
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759033,
"acc_norm": 0.95,
"acc_norm_stderr": 0.021904291355759033
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42717258261933905,
"mc1_stderr": 0.017316834410963933,
"mc2": 0.6019109516805667,
"mc2_stderr": 0.014606562783785249
},
"harness|winogrande|5": {
"acc": 0.824782951854775,
"acc_stderr": 0.010684179227706177
},
"harness|gsm8k|5": {
"acc": 0.7043214556482184,
"acc_stderr": 0.012570068947898772
}
}
```
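Because the per-task entries above share the same shape, the results JSON can be post-processed directly, for instance to rank tasks by accuracy. An illustrative sketch using a hand-copied subset of the scores above (mc-style tasks report `mc1`/`mc2` instead of `acc` and are omitted here):

```python
# Subset of the results above, keyed by task; each entry has at least "acc".
results = {
    "harness|hendrycksTest-college_biology|5": {"acc": 0.9375},
    "harness|hendrycksTest-virology|5": {"acc": 0.5843373493975904},
    "harness|winogrande|5": {"acc": 0.824782951854775},
    "harness|gsm8k|5": {"acc": 0.7043214556482184},
}

# Sort tasks from strongest to weakest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)
for task, scores in ranked:
    print(f"{scores['acc']:.4f}  {task}")
```

The same loop works on the full `results_*.json` file once it is downloaded and parsed with `json.load`.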
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kensho/spgispeech_demo | ---
annotations_creators:
- expert-generated
language_creators:
- found
languages:
- en
license:
- other
multilinguality:
- monolingual
pretty_name: SpgiSpeech
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- automatic-speech-recognition
task_ids: []
extra_gated_prompt: |-
Your access to and use of the information in the Kensho Transcript Dataset (the “Content”), which is provided by Kensho Technologies, LLC, a subsidiary of S&P Global, Inc., (“Kensho”), shall be governed by the following terms and conditions of usage (“Terms of Usage”). The Content may be accessed only by persons who have been authorized to use this Content pursuant to their acceptance and acknowledgement of these Terms of Usage (in each case, an “Authorized User”). By providing your electronic signature at the end of these Terms of Usage, you represent that you are an Authorized User and that you accept these Terms of Usage and agree to be bound by them.
If you do not wish to be bound by these Terms of Usage, you must not use this Content. PLEASE READ THESE TERMS OF USAGE CAREFULLY BEFORE USING THIS CONTENT.
Section 1 – THE CONTENT
1.1 The Content is provided for academic research purposes and internal use only and must not be used to: assemble or create a database; construct or facilitate the construction of products which compete with the Content; identify or attempt to identify or contact any individual; or link to another dataset.
The Content, which is comprised of public earnings calls in audio and corresponding text format, and all accompanying derived products is proprietary to Kensho and its third-party content providers. You shall not modify the Content; create derivative works based on the Content, rewrite or reprocess the Content except as expressly provided herein. You must not publish, display, transfer or redistribute the Content or any portions or derivative versions thereof to anyone without prior written consent from Kensho. You agree not to contact Kensho or its affiliates concerning individuals whose information may be included in the Content.
1.2 Disclaimer. Content to which you are provided access, either directly or indirectly, from or on this Content will not have been reviewed or monitored by Kensho, and Kensho cannot and does not guarantee or make any representation or warranty, either express or implied, as to the accuracy, validity, timeliness, completeness or continued availability of any such content.
The Content is provided for your convenience only and is not a republication or reconfirmation of the opinion or information contained therein. The provision of the Content is without any obligation on the part of Kensho or its third-party content providers to review such or any liability or responsibility arising out of your use thereof. Kensho does not guarantee or make any representation or warranty, either express or implied, as to the accuracy, validity, timeliness, completeness or continued availability of any Content and shall not be liable for any errors, delays, or actions taken in reliance on information. In addition, the Content speaks only as of the date issued and is based on conference calls that may contain projections of other forward-looking statements. You should not rely on the Content as expressing Kensho’s opinion or as representing current information. None of Kensho or the third-party content providers has undertaken, and do not undertake any duty to update any Content or otherwise advise you of any changes in the Content.
1.3 Ownership of Third-Party Content. You acknowledge that all proprietary rights in the Content that are owned by Kensho or third party content providers shall remain the property of Kensho or such third party content providers, and you shall have no right or interest in such third party content except the rights to use such third party content in accordance with these Terms of Usage. Any additional rights not granted herein shall require a separate, direct agreement with Kensho. You acknowledge that the Content and third party content as compiled, prepared, selected and arranged by Kensho or its third party content providers constitutes an expenditure of substantial time, effort and money by Kensho and its third party content providers and constitutes valuable commercial property and/or trade secrets of Kensho and such third party content providers. Kensho retains all rights and remedies afforded under the copyright, trademark, service mark, patent and other laws of the United States and the States thereof, including without limitation any laws designed to protect proprietary or confidential information. You agree that you will not remove or modify any copyright notice, disclosures, disclaimers or other notification or trade name or marks of Kensho or the third party content providers that may appear in the Content or third party content and that any permitted reproduction and/or distribution of the Content or third party content shall contain such notices and/or marks as they appear in the Content or third party content. You may not use Kensho’s or the third-party content providers’ name or trademarks without the prior written consent of Kensho or such third-party content providers. Apart from the rights granted hereunder, no conveyance of ownership, right, title or interest is intended herein. Any additional rights require a separate agreement with Kensho.
1.4 Posted Guidelines. In addition to these Terms of Usage, when using this Content, you shall be subject to and agree to follow any posted notice, guidelines or rules, which may be posted and amended from time to time. Nothing on this Content shall be considered a recommendation or solicitation to buy or an offer to sell a security to any person in any jurisdiction.
1.5 Registration Data. In consideration of your use of this Content, you and/or your employer agree to: (a) provide true, accurate, current and complete Registration Data (as defined below in Section 3.1) to Kensho as prompted by the registration form completed prior to accessing the Content and (b) maintain and promptly update the Registration Data and to keep the same true, accurate, current and complete.
1.6 Right to Terminate User Access. Kensho reserves the right to limit, restrict and immediately terminate your access to and use of this Content at any time, in whole or in part, in its sole discretion and without notice.
Section 2 - DISCLAIMER OF WARRANTY AND LIMITATION OF LIABILITY
2.1 THE CONTENT IS PROVIDED “AS IS” AND “AS AVAILABLE” WITHOUT REPRESENTATION OR WARRANTY OF ANY KIND. USE OF THE CONTENT IS AT THE USER’S OWN RISK. IN NO EVENT SHALL KENSHO OR ITS THIRD-PARTY CONTENT PROVIDERS BE LIABLE FOR ANY DECISION MADE OR ACTION OR INACTION TAKEN IN RELIANCE ON ANY CONTENT, INCLUDING THIRD-PARTY CONTENT, INCLUDING YOUR HANDLING AND STORING OF THE CONTENT. KENSHO FURTHER EXPLICITLY DISCLAIMS, ANY WARRANTY OF ANY KIND, WHETHER EXPRESS OR IMPLIED, INCLUDING WARRANTIES OF ORIGINALITY, ACCURACY, COMPLETENESS, TIMELINESS, MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT. KENSHO EXPRESSLY DISCLAIMS, AND YOU WAIVE, ANY LIABILITY THAT MAY ARISE FROM YOUR PUBLICATION OR PROVISION OF THE CONTENT TO A THIRD PARTY, OR ANY REPRESENTATION OR WARRANTY MADE BY YOU TO ANY THIRD PARTY, WHETHER OR NOT RELATED TO THE CONTENT. KENSHO, SUPPLIERS OF THIRD-PARTY CONTENT AND ANY OTHER THIRD PARTY WORKING WITH KENSHO SHALL NOT BE RESPONSIBLE OR LIABLE, DIRECTLY OR INDIRECTLY, FOR ANY DAMAGES OR LOSS (INCLUDING DIRECT, INDIRECT, INCIDENTAL, CONSEQUENTIAL AND ANY AND ALL OTHER FORMS OF DAMAGES OR LOSSES REGARDLESS OF THE FORM OF THE ACTION OR THE BASIS OF THE CLAIM) CAUSED OR ALLEGED TO BE CAUSED IN CONNECTION WITH YOUR USE OF THE CONTENT WHETHER OR NOT FORESEEABLE, EVEN IF KENSHO OR ANY OF THE SUPPLIERS OF THIRD-PARTY CONTENT OR OTHER THIRD PARTIES WORKING WITH KENSHO IN CONNECTION WITH THE CONTENT HAS BEEN ADVISED OF THE POSSIBILITY OR LIKELIHOOD OF SUCH DAMAGES.
2.2 THE CONTENT IS NOT INTENDED TO PROVIDE TAX, LEGAL, INSURANCE OR INVESTMENT ADVICE, AND NOTHING IN THE CONTENT SHOULD BE CONSTRUED AS AN OFFER TO SELL, A SOLICITATION OF AN OFFER TO BUY, OR A RECOMMENDATION FOR ANY SECURITY BY KENSHO OR ANY THIRD PARTY.
2.3 For third party demands, claims, actions, proceedings and liability for losses, damages, reasonable legal costs and other reasonable expenses of any nature, you agree to defend, indemnify and hold Kensho and its affiliates harmless, including its respective directors, officers, employees and agents from and against all claims to the extent arising from your access to and/or use of the Content, any failure by you to abide by the Terms of Usage, or breach of applicable law.
Section 3 - PRIVACY
3.1 Access and Collection. In order to access this Content, during the registration process, either you or your employer will be required to provide Kensho with certain information; including your name, employer or academic institution, and e-mail address (“Registration Data”). In addition, when you request or view Content, Kensho may obtain user identifiable information related to your request of, or access to, such Content (“Access Data”). For example, while you are accessing this Content, our Web servers may recognize your: (a) domain name; (b) ISP’s domain name; (c) IP address; (d) browser type; and (e) operating system. If you contact us with a technical question, we may collect certain information about your systems, including: (a) your browser type, version and settings (e.g., Java and cookie settings); (b) connectivity information (e.g., SSL/HTTPS compatibility, bandwidth capacity); and browser plug-in information (e.g., do you have Adobe, what is your media player, can you open Flash files, etc.).
3.2 Use of Your Information. Registration Data and Access Data may be used by Kensho for research and development purposes and to communicate with users and to troubleshoot any technical issues pertaining to the Content. You acknowledge that in the event that a separate agreement is required, Kensho may share Registration Data with its Affiliates (as defined below).
3.3 Disclosure of Your Information. Except as otherwise noted herein, Kensho will not disclose, rent or sell personal information collected from or about you without your permission. For the purposes specified in the preceding paragraph, we may transfer or disclose Registration Data and Access Data to S&P Global Inc. and its affiliates (“Kensho Affiliates”) and third parties who are contracted to perform services on behalf of Kensho, such as those who assist Kensho in bringing you this Content and providing you with certain features and functionality included within or accessible via this Content. We may also disclose Registration Data and Access Data to Kensho Affiliates and third parties in connection with their providing you access to this Content. Disclosures to these third parties will be subject to confidentiality agreements and, where required, governed by contract. Kensho may also be required to disclose information to governmental, regulatory or self-regulatory entities or agencies in response to regulatory inquiries or to comply with applicable laws, rules, regulations, orders, subpoenas or other legal processes.
3.4 Consent. By (a) agreeing to these Terms of Usage, or (b) by using this Content, and, in either case, providing any information that may be required, requested or otherwise collected by us as set forth above, you freely consent to Kensho processing your information in the United States and in other countries and territories for the purposes set out in these Terms of Usage, and you also consent to the transfer of your information for such purposes to any third party content provider wherever such entity may from time to time be located and to any third parties as described above and in accordance with applicable law and regulations. If you do not permit Kensho to collect any of your information or do not agree with any of the terms and conditions of these Terms of Usage, you should not use this Content and should exit this page and/or Content, as the case may be. If after registering with Kensho, you desire to withdraw the consent granted in this Section 3.4 for all future use of your information by Kensho, you must notify Kensho in writing at the address listed below in Section 3.8 and immediately cease use of this Content.
3.5 Inquiries. If you have any questions regarding these Terms of Usage or your information that is held by us, please contact Kensho in writing using the contact information provided below. If we receive a request regarding your personal information held by us, we will use reasonable means to provide you with such information that we can reasonably compile. You will be given the opportunity to rectify any inaccuracies in such information.
3.6 Encryption. Kensho may use encryption technology to protect certain transmissions of data to/from this Content, but e-mail and other communications, unless otherwise noted on this Content, are not encrypted to/from this Content. Therefore, you should not send any personal or identifying information, such as account numbers, credit card numbers, Social Security numbers, passwords, etc., to Kensho via e-mail. By utilizing e-mail or other electronic communication means you acknowledge that you have no expectation of privacy with respect to the information delivered thereby and that Kensho will not be responsible for any loss or damage that could result from interception by third parties of any information so sent.
3.7 Contact Information. In the event you have any questions regarding these Terms of Use, this Privacy Statement or to make any requests or queries regarding your information that is held by us you may contact us in writing at privacy@kensho.com or Kensho Technologies LLC, Attn: General Counsel, 55 Water Street, New York, NY 10041.
Section 4 - MISCELLANEOUS
4.1 Entire Agreement. These Terms of Usage constitute the entire agreement of the parties hereto with respect to the subject matter hereof and supersede all prior agreements and undertakings, both written and oral, between the parties with respect to the subject matter hereof.
4.2 Severability. If any term or other provision of these Terms of Usage is invalid, illegal or incapable of being enforced by any law or public policy, all other terms and provisions of these Terms of Usage shall nevertheless remain in full force and effect so long as the economic or legal substance of the transactions contemplated hereby is not affected in any manner materially adverse to any party.
4.3 Governing Law; Forum. These Terms of Usage shall be governed in all respects by the laws of the State of New York, and any litigation arising out of or connected in any way with these Terms of Usage shall take place in a State or Federal court of competent jurisdiction in New York County, State of New York.
4.4 Waiver of Jury Trial. YOU WAIVE TO THE FULLEST EXTENT PERMITTED BY APPLICABLE LAW ANY RIGHT YOU MAY HAVE TO A TRIAL BY JURY WITH RESPECT TO ANY ACTIONS OR PROCEEDINGS DIRECTLY OR INDIRECTLY ARISING OUT OF, UNDER OR IN CONNECTION WITH THESE TERMS OF USAGE.
4.5 Conflict. In the event of a conflict between these Terms of Use and any other agreement with Kensho that relates to Third-Party Content, the more restrictive terms shall prevail.
extra_gated_fields:
Full name: text
Email: text
Institution: text
I accept the Terms of Usage: checkbox
---
# Dataset Card for SPGISpeech
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
- [Terms of Usage](#terms-of-usage)
## Dataset Description
- **Homepage:** https://datasets.kensho.com/datasets/spgispeech
- **Repository:**
- **Paper:** https://arxiv.org/abs/2104.02014
- **Leaderboard:**
- **Point of Contact:** [data@kensho.com](mailto:data@kensho.com)
### Dataset Summary
SPGISpeech (rhymes with “squeegee-speech”) is a large-scale transcription dataset, freely available for academic research.
SPGISpeech is a corpus of 5,000 hours of professionally-transcribed financial audio.
SPGISpeech contains a broad cross-section of L1 and L2 English accents,
strongly varying audio quality, and both spontaneous and narrated speech. The transcripts have each been cross-checked
by multiple professional editors for high accuracy and are fully formatted, including capitalization, punctuation, and
denormalization of non-standard words.
SPGISpeech consists of 5,000 hours of recorded company earnings calls and their respective transcriptions.
The original calls were split into slices ranging from 5 to 15 seconds in length to allow easy training for
speech recognition systems. Calls represent a broad cross-section of international business English;
SPGISpeech contains approximately 50,000 speakers, one of the largest numbers of any speech corpus,
and offers a variety of L1 and L2 English accents. The format of each WAV file is single channel, 16kHz, 16 bit audio.
### Example Usage
The training split has several configurations of various sizes: S, M, L. See the section [Data Splits](#data-splits)
for more information. To download the S configuration:
```python
from datasets import load_dataset
spgi = load_dataset("kensho/spgispeech", "S", use_auth_token=True)
# see structure
print(spgi)
# load audio sample on the fly
audio_input = spgi["train"][0]["audio"] # first decoded audio sample
transcription = spgi["train"][0]["transcript"]  # first transcription
```
It is possible to download only the development or test data:
```python
spgi_dev = load_dataset("kensho/spgispeech", "dev", use_auth_token=True)
spgi_test = load_dataset("kensho/spgispeech", "test", use_auth_token=True)
```
### Supported Tasks and Leaderboards
- `automatic-speech-recognition`: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
The model is presented with an audio file and asked to transcribe the audio file to written text.
The most common evaluation metric is the word error rate (WER).
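To make the metric concrete, here is a minimal word-level WER implementation: the Levenshtein (edit) distance over words, divided by the number of reference words. This is a generic sketch, not SPGISpeech tooling; production evaluations typically use a library such as `jiwer`.

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref = reference.split()
    hyp = hypothesis.split()
    # Dynamic-programming edit distance over word sequences.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # substitution
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

# One deletion ("to") and one substitution ("targeted" -> "target")
# against a 9-word reference gives WER = 2/9.
score = wer("we are on track to exceed our targeted savings",
            "we are on track exceed our target savings")
```

Because SPGISpeech transcripts are fully formatted, WER on this corpus is typically computed on the formatted text, so casing and punctuation choices affect the score.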
### Languages
SPGISpeech contains audio and transcription data in business English and offers a variety of L1 and L2 accents.
## Dataset Structure
### Data Instances
```python
{
'wav_filename': '32bcf9c9dc707fb61a04290e296f31eb/99.wav',
'audio': {
'path': '/home/user/.cache/huggingface/datasets/downloads/extracted/c7082e2bd5b.../dev_part_2/32bcf9c9dc707fb61a04290e296f31eb/99.wav',
'array': array([-0.00039673, -0.00057983, -0.00057983, ..., -0.0007019 ,
-0.00027466, 0.00021362], dtype=float32),
'sampling_rate': 16000
},
'wav_filesize': 292844,
'transcript': 'This is proving to be true, and through focused execution we are on track to exceed our targeted savings in 2017. As a reminder,'
}
```
### Data Fields
* wav_filename (string) - audio filename (includes parent directory).
* audio (Audio feature) - a dictionary containing the path to the audio, the decoded audio array, and the sampling rate.
In non-streaming mode (default), the path points to the locally extracted audio. In streaming mode, the path is the relative path of an audio
inside its archive (as files are not downloaded and extracted locally).
* wav_filesize (int) - size of the file in bytes.
* transcript (string) - transcription of the file.
### Data Splits
The dataset has three splits: train, evaluation (dev) and test. The train split has three configurations of various sizes:
S, M, L. Larger subsets are supersets of smaller subsets, e.g., the L subset contains all the data from the M subset.
#### Transcribed Subsets Size
| Subset | Size |
|:------:|:-------:|
| S | 22 GB |
| M | 107 GB |
| L | 530 GB |
| dev | 11 GB |
| test | 11 GB |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
The dataset contains S&P Global company earnings calls.
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
English speakers with a diverse selection of accents, including non-native ones (L2), producing both
spontaneous and narrated speech.
### Annotations
#### Annotation process
Data is orthographically transcribed according to a professional style guide detailing conventions for capitalization, punctuation,
denormalization of non-standard words and transcription of disfluencies in spontaneous speech.
The transcripts have each been cross-checked by multiple professional editors for high accuracy and are fully formatted.
Full earnings calls are 30-60 minutes in length and are typically
transcribed as whole units, without internal timestamps. In order to produce short audio slices suitable for STT
training, the files were segmented with [Gentle](https://lowerquality.com/gentle/), a double-pass forced aligner,
with the beginning and end of each slice of audio imputed by voice activity detection with
[py-webrtc](https://github.com/wiseman/py-webrtcvad).
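The slicing step can be pictured with a toy example. The real pipeline uses Gentle for forced alignment and py-webrtcvad for voice activity detection; the sketch below substitutes a simple frame-energy threshold purely to illustrate how a long recording is cut into voiced slices. The function name, frame length, and threshold here are illustrative assumptions, not values from the SPGISpeech pipeline.

```python
def voiced_slices(samples, sample_rate=16000, frame_ms=30, threshold=0.01):
    """Return (start, end) sample indices of contiguous voiced regions,
    using mean absolute frame amplitude as a stand-in for a real VAD."""
    frame_len = sample_rate * frame_ms // 1000
    flags = []
    for i in range(0, len(samples), frame_len):
        frame = samples[i:i + frame_len]
        energy = sum(abs(x) for x in frame) / max(len(frame), 1)
        flags.append(energy > threshold)
    # Merge consecutive voiced frames into slices.
    slices, start = [], None
    for idx, voiced in enumerate(flags):
        if voiced and start is None:
            start = idx * frame_len
        elif not voiced and start is not None:
            slices.append((start, idx * frame_len))
            start = None
    if start is not None:
        slices.append((start, len(samples)))
    return slices

# Toy signal at 16 kHz: 0.5 s silence, 1 s "speech", 0.5 s silence.
sig = [0.0] * 8000 + [0.1] * 16000 + [0.0] * 8000
slices = voiced_slices(sig)
```

On the toy signal this yields a single voiced slice whose boundaries snap to 30 ms frame edges; the real VAD makes a statistical speech/non-speech decision per frame rather than thresholding raw energy.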
#### Who are the annotators?
Earnings calls are manually transcribed by S&P Global, Inc.
### Personal and Sensitive Information
Though earnings calls are public, we nevertheless identified full names with the spaCy `en_core_web_lg` model.
We withheld samples containing names that appeared fewer than ten times (7% of total). Full
names appearing ten times or more in the data were considered to be public figures and were retained.
This necessarily incomplete approach to named entity recognition was complemented with randomized manual spot
checks which uncovered no false negatives missed by the automated approach.
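The frequency cutoff described above can be sketched as follows. This illustrates only the withholding rule (keep a sample when every detected full name appears at least ten times corpus-wide); the NER step is replaced by precomputed name lists, and the data and function name are hypothetical, not Kensho's actual code.

```python
from collections import Counter

def filter_by_name_frequency(samples, min_count=10):
    """samples: list of (transcript, detected_names) pairs. Keep a sample
    only if every detected full name occurs >= min_count times corpus-wide."""
    counts = Counter(name for _, names in samples for name in names)
    return [
        (text, names)
        for text, names in samples
        if all(counts[n] >= min_count for n in names)
    ]

# Hypothetical corpus: frequent names are treated as public figures.
corpus = (
    [("quarter results", [])] * 5
    + [("remarks by Jane Doe", ["Jane Doe"])] * 12   # 12 >= 10: retained
    + [("question from John Roe", ["John Roe"])] * 3  # 3 < 10: withheld
)
kept = filter_by_name_frequency(corpus)
```

Samples with no detected names pass through unchanged; only samples containing a rare name are withheld.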
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
### Citation Information
Please cite this paper:
```bibtex
@ARTICLE{2021arXiv210402014O,
author = {{O'Neill}, Patrick K. and {Lavrukhin}, Vitaly and {Majumdar},
Somshubra and {Noroozi}, Vahid and {Zhang}, Yuekai and {Kuchaiev}, Oleksii and {Balam},
Jagadeesh and {Dovzhenko}, Yuliya and {Freyberg}, Keenan and {Shulman}, Michael D. and {Ginsburg},
Boris and {Watanabe}, Shinji and {Kucsko}, Georg},
title = "{SPGISpeech: 5,000 hours of transcribed financial audio for fully formatted end-to-end speech recognition}",
journal = {arXiv e-prints},
keywords = {Computer Science - Computation and Language, Electrical Engineering and Systems Science - Audio and Speech Processing},
year = 2021,
month = apr,
eid = {arXiv:2104.02014},
pages = {arXiv:2104.02014},
archivePrefix = {arXiv},
eprint = {2104.02014},
primaryClass = {cs.CL},
adsurl = {https://ui.adsabs.harvard.edu/abs/2021arXiv210402014O},
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
```
### Contributions
Thanks to [@sanchit-gandhi](https://github.com/sanchit-gandhi), [@patrickvonplaten](https://github.com/patrickvonplaten),
and [@polinaeterna](https://github.com/polinaeterna) for adding this dataset.
## Terms of Usage
Your access to and use of the information in the Kensho Transcript Dataset (the “Content”), which is provided by Kensho Technologies, LLC, a subsidiary of S&P Global, Inc., (“Kensho”), shall be governed by the following terms and conditions of usage (“Terms of Usage”). The Content may be accessed only by persons who have been authorized to use this Content pursuant to their acceptance and acknowledgement of these Terms of Usage (in each case, an “Authorized User”). By providing your electronic signature at the end of these Terms of Usage, you represent that you are an Authorized User and that you accept these Terms of Usage and agree to be bound by them.
If you do not wish to be bound by these Terms of Usage, you must not use this Content. PLEASE READ THESE TERMS OF USAGE CAREFULLY BEFORE USING THIS CONTENT.
Section 1 – THE CONTENT
1.1 The Content is provided for academic research purposes and internal use only and must not be used to:
- assemble or create a database;
- construct or facilitate the construction of products which compete with the Content;
- identify or attempt to identify or contact any individual; or
- link to another dataset.
The Content, which is comprised of public earnings calls in audio and corresponding text format, and all accompanying derived products is proprietary to Kensho and its third-party content providers. You shall not modify the Content; create derivative works based on the Content, rewrite or reprocess the Content except as expressly provided herein. You must not publish, display, transfer or redistribute the Content or any portions or derivative versions thereof to anyone without prior written consent from Kensho. You agree not to contact Kensho or its affiliates concerning individuals whose information may be included in the Content.
1.2 Disclaimer. Content to which you are provided access, either directly or indirectly, from or on this Content will not have been reviewed or monitored by Kensho, and Kensho cannot and does not guarantee or make any representation or warranty, either express or implied, as to the accuracy, validity, timeliness, completeness or continued availability of any such content.
The Content is provided for your convenience only and is not a republication or reconfirmation of the opinion or information contained therein. The provision of the Content is without any obligation on the part of Kensho or its third-party content providers to review such or any liability or responsibility arising out of your use thereof. Kensho does not guarantee or make any representation or warranty, either express or implied, as to the accuracy, validity, timeliness, completeness or continued availability of any Content and shall not be liable for any errors, delays, or actions taken in reliance on information. In addition, the Content speaks only as of the date issued and is based on conference calls that may contain projections of other forward-looking statements. You should not rely on the Content as expressing Kensho’s opinion or as representing current information. None of Kensho or the third-party content providers has undertaken, and do not undertake any duty to update any Content or otherwise advise you of any changes in the Content.
1.3 Ownership of Third-Party Content. You acknowledge that all proprietary rights in the Content that are owned by Kensho or third party content providers shall remain the property of Kensho or such third party content providers, and you shall have no right or interest in such third party content except the rights to use such third party content in accordance with these Terms of Usage. Any additional rights not granted herein shall require a separate, direct agreement with Kensho. You acknowledge that the Content and third party content as compiled, prepared, selected and arranged by Kensho or its third party content providers constitutes an expenditure of substantial time, effort and money by Kensho and its third party content providers and constitutes valuable commercial property and/or trade secrets of Kensho and such third party content providers. Kensho retains all rights and remedies afforded under the copyright, trademark, service mark, patent and other laws of the United States and the States thereof, including without limitation any laws designed to protect proprietary or confidential information. You agree that you will not remove or modify any copyright notice, disclosures, disclaimers or other notification or trade name or marks of Kensho or the third party content providers that may appear in the Content or third party content and that any permitted reproduction and/or distribution of the Content or third party content shall contain such notices and/or marks as they appear in the Content or third party content. You may not use Kensho’s or the third-party content providers’ name or trademarks without the prior written consent of Kensho or such third-party content providers. Apart from the rights granted hereunder, no conveyance of ownership, right, title or interest is intended herein. Any additional rights require a separate agreement with Kensho.
1.4 Posted Guidelines. In addition to these Terms of Usage, when using this Content, you shall be subject to and agree to follow any posted notice, guidelines or rules, which may be posted and amended from time to time. Nothing on this Content shall be considered a recommendation or solicitation to buy or an offer to sell a security to any person in any jurisdiction.
1.5 Registration Data. In consideration of your use of this Content, you and/or your employer agree to: (a) provide true, accurate, current and complete Registration Data (as defined below in Section 3.1) to Kensho as prompted by the registration form completed prior to accessing the Content and (b) maintain and promptly update the Registration Data and to keep the same true, accurate, current and complete.
1.6 Right to Terminate User Access. Kensho reserves the right to limit, restrict and immediately terminate your access to and use of this Content at any time, in whole or in part, in its sole discretion and without notice.
Section 2 - DISCLAIMER OF WARRANTY AND LIMITATION OF LIABILITY
2.1 THE CONTENT IS PROVIDED “AS IS” AND “AS AVAILABLE” WITHOUT REPRESENTATION OR WARRANTY OF ANY KIND. USE OF THE CONTENT IS AT THE USER’S OWN RISK. IN NO EVENT SHALL KENSHO OR ITS THIRD-PARTY CONTENT PROVIDERS BE LIABLE FOR ANY DECISION MADE OR ACTION OR INACTION TAKEN IN RELIANCE ON ANY CONTENT, INCLUDING THIRD-PARTY CONTENT, INCLUDING YOUR HANDLING AND STORING OF THE CONTENT. KENSHO FURTHER EXPLICITLY DISCLAIMS, ANY WARRANTY OF ANY KIND, WHETHER EXPRESS OR IMPLIED, INCLUDING WARRANTIES OF ORIGINALITY, ACCURACY, COMPLETENESS, TIMELINESS, MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT. KENSHO EXPRESSLY DISCLAIMS, AND YOU WAIVE, ANY LIABILITY THAT MAY ARISE FROM YOUR PUBLICATION OR PROVISION OF THE CONTENT TO A THIRD PARTY, OR ANY REPRESENTATION OR WARRANTY MADE BY YOU TO ANY THIRD PARTY, WHETHER OR NOT RELATED TO THE CONTENT. KENSHO, SUPPLIERS OF THIRD-PARTY CONTENT AND ANY OTHER THIRD PARTY WORKING WITH KENSHO SHALL NOT BE RESPONSIBLE OR LIABLE, DIRECTLY OR INDIRECTLY, FOR ANY DAMAGES OR LOSS (INCLUDING DIRECT, INDIRECT, INCIDENTAL, CONSEQUENTIAL AND ANY AND ALL OTHER FORMS OF DAMAGES OR LOSSES REGARDLESS OF THE FORM OF THE ACTION OR THE BASIS OF THE CLAIM) CAUSED OR ALLEGED TO BE CAUSED IN CONNECTION WITH YOUR USE OF THE CONTENT WHETHER OR NOT FORESEEABLE, EVEN IF KENSHO OR ANY OF THE SUPPLIERS OF THIRD-PARTY CONTENT OR OTHER THIRD PARTIES WORKING WITH KENSHO IN CONNECTION WITH THE CONTENT HAS BEEN ADVISED OF THE POSSIBILITY OR LIKELIHOOD OF SUCH DAMAGES.
2.2 THE CONTENT IS NOT INTENDED TO PROVIDE TAX, LEGAL, INSURANCE OR INVESTMENT ADVICE, AND NOTHING IN THE CONTENT SHOULD BE CONSTRUED AS AN OFFER TO SELL, A SOLICITATION OF AN OFFER TO BUY, OR A RECOMMENDATION FOR ANY SECURITY BY KENSHO OR ANY THIRD PARTY.
2.3 For third party demands, claims, actions, proceedings and liability for losses, damages, reasonable legal costs and other reasonable expenses of any nature, you agree to defend, indemnify and hold Kensho and its affiliates harmless, including its respective directors, officers, employees and agents from and against all claims to the extent arising from your access to and/or use of the Content, any failure by you to abide by the Terms of Usage, or breach of applicable law.
Section 3 - PRIVACY
3.1 Access and Collection. In order to access this Content, during the registration process, either you or your employer will be required to provide Kensho with certain information; including your name, employer or academic institution, and e-mail address (“Registration Data”). In addition, when you request or view Content, Kensho may obtain user identifiable information related to your request of, or access to, such Content (“Access Data”). For example, while you are accessing this Content, our Web servers may recognize your: (a) domain name; (b) ISP’s domain name; (c) IP address; (d) browser type; and (e) operating system. If you contact us with a technical question, we may collect certain information about your systems, including: (a) your browser type, version and settings (e.g., Java and cookie settings); (b) connectivity information (e.g., SSL/HTTPS compatibility, bandwidth capacity); and browser plug-in information (e.g., do you have Adobe, what is your media player, can you open Flash files, etc.).
3.2 Use of Your Information. Registration Data and Access Data may be used by Kensho for research and development purposes and to communicate with users and to troubleshoot any technical issues pertaining to the Content. You acknowledge that in the event that a separate agreement is required, Kensho may share Registration Data with its Affiliates (as defined below).
3.3 Disclosure of Your Information. Except as otherwise noted herein, Kensho will not disclose, rent or sell personal information collected from or about you without your permission. For the purposes specified in the preceding paragraph, we may transfer or disclose Registration Data and Access Data to S&P Global Inc. and its affiliates (“Kensho Affiliates”) and third parties who are contracted to perform services on behalf of Kensho, such as those who assist Kensho in bringing you this Content and providing you with certain features and functionality included within or accessible via this Content. We may also disclose Registration Data and Access Data to Kensho Affiliates and third parties in connection with their providing you access to this Content. Disclosures to these third parties will be subject to confidentiality agreements and, where required, governed by contract. Kensho may also be required to disclose information to governmental, regulatory or self-regulatory entities or agencies in response to regulatory inquiries or to comply with applicable laws, rules, regulations, orders, subpoenas or other legal processes.
3.4 Consent. By (a) agreeing to these Terms of Usage, or (b) by using this Content, and, in either case, providing any information that may be required, requested or otherwise collected by us as set forth above, you freely consent to Kensho processing your information in the United States and in other countries and territories for the purposes set out in these Terms of Usage, and you also consent to the transfer of your information for such purposes to any third party content provider wherever such entity may from time to time be located and to any third parties as described above and in accordance with applicable law and regulations. If you do not permit Kensho to collect any of your information or do not agree with any of the terms and conditions of these Terms of Usage, you should not use this Content and should exit this page and/or Content, as the case may be. If after registering with Kensho, you desire to withdraw the consent granted in this Section 3.4 for all future use of your information by Kensho, you must notify Kensho in writing at the address listed below in Section 3.8 and immediately cease use of this Content.
3.5 Inquiries. If you have any questions regarding these Terms of Usage or your information that is held by us, please contact Kensho in writing using the contact information provided below. If we receive a request regarding your personal information held by us, we will use reasonable means to provide you with such information that we can reasonably compile. You will be given the opportunity to rectify any inaccuracies in such information.
3.6 Encryption. Kensho may use encryption technology to protect certain transmissions of data to/from this Content, but e-mail and other communications, unless otherwise noted on this Content, are not encrypted to/from this Content. Therefore, you should not send any personal or identifying information, such as account numbers, credit card numbers, Social Security numbers, passwords, etc., to Kensho via e-mail. By utilizing e-mail or other electronic communication means you acknowledge that you have no expectation of privacy with respect to the information delivered thereby and that Kensho will not be responsible for any loss or damage that could result from interception by third parties of any information so sent.
3.7 Contact Information. In the event you have any questions regarding these Terms of Use, this Privacy Statement or to make any requests or queries regarding your information that is held by us you may contact us in writing at privacy@kensho.com or Kensho Technologies LLC, Attn: General Counsel, 55 Water Street, New York, NY 10041.
Section 4 - MISCELLANEOUS
4.1 Entire Agreement. These Terms of Usage constitute the entire agreement of the parties hereto with respect to the subject matter hereof and supersede all prior agreements and undertakings, both written and oral, between the parties with respect to the subject matter hereof.
4.2 Severability. If any term or other provision of these Terms of Usage is invalid, illegal or incapable of being enforced by any law or public policy, all other terms and provisions of these Terms of Usage shall nevertheless remain in full force and effect so long as the economic or legal substance of the transactions contemplated hereby is not affected in any manner materially adverse to any party.
4.3 Governing Law; Forum. These Terms of Usage shall be governed in all respects by the laws of the State of New York, and any litigation arising out of or connected in any way with these Terms of Usage shall take place in a State or Federal court of competent jurisdiction in New York County, State of New York.
4.4 Waiver of Jury Trial. YOU WAIVE TO THE FULLEST EXTENT PERMITTED BY APPLICABLE LAW ANY RIGHT YOU MAY HAVE TO A TRIAL BY JURY WITH RESPECT TO ANY ACTIONS OR PROCEEDINGS DIRECTLY OR INDIRECTLY ARISING OUT OF, UNDER OR IN CONNECTION WITH THESE TERMS OF USAGE.
4.5 Conflict. In the event of a conflict between these Terms of Usage and any other agreement with Kensho that relates to Third-Party Content, the more restrictive terms shall prevail.
|
tanmoy-in/sample_db | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 14921662966
num_examples: 14600000
download_size: 408454699
dataset_size: 14921662966
---
# Dataset Card for "sample_db"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yunosuken/sentiment-train | ---
viewer: true
dataset_info:
  homepage: https://www.yahoo.co.jp
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 897816
num_examples: 8476
- name: validation
num_bytes: 52805
num_examples: 497
- name: test
num_bytes: 109825
num_examples: 1002
download_size: 601239
dataset_size: 1060446
description: hoge
---
test |
zuhaz/asiapgas | ---
license: mit
---
|
slseanwu/ghcode_python_split_700k | ---
dataset_info:
features:
- name: code
dtype: string
- name: repo_name
dtype: string
- name: path
dtype: string
- name: language
dtype: string
- name: license
dtype: string
- name: size
dtype: int64
splits:
- name: train
num_bytes: 4966735682
num_examples: 639947
- name: test
num_bytes: 549533747
num_examples: 71106
download_size: 1993230859
dataset_size: 5516269429
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview | ---
pretty_name: Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.5-preview
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [WebraftAI/synapsellm-7b-mistral-v0.5-preview](https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.5-preview)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T20:01:18.948310](https://huggingface.co/datasets/open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview/blob/main/results_2023-12-09T20-01-18.948310.json)(note\
  \ that there might be results for other tasks in the repo if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5441057040654342,\n\
\ \"acc_stderr\": 0.03404499199717172,\n \"acc_norm\": 0.5501066597591592,\n\
\ \"acc_norm_stderr\": 0.034782781683925894,\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5516274394366725,\n\
\ \"mc2_stderr\": 0.01504190113817455\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4931740614334471,\n \"acc_stderr\": 0.014610029151379813,\n\
\ \"acc_norm\": 0.5273037542662116,\n \"acc_norm_stderr\": 0.014589589101985994\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5624377614021111,\n\
\ \"acc_stderr\": 0.004950723480149757,\n \"acc_norm\": 0.7650866361282613,\n\
\ \"acc_norm_stderr\": 0.004230782375004432\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296562,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296562\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009794,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009794\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.0381189098894041,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.0381189098894041\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179327,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179327\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.02467786284133278,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.02467786284133278\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6290322580645161,\n \"acc_stderr\": 0.027480541887953593,\n \"\
acc_norm\": 0.6290322580645161,\n \"acc_norm_stderr\": 0.027480541887953593\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n \"\
acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031595,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031595\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7171717171717171,\n \"acc_stderr\": 0.032087795587867514,\n \"\
acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.032087795587867514\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.032922966391551414,\n\
\ \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.032922966391551414\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.49743589743589745,\n \"acc_stderr\": 0.025350672979412195,\n\
\ \"acc_norm\": 0.49743589743589745,\n \"acc_norm_stderr\": 0.025350672979412195\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.02742001935094527,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.02742001935094527\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.032449808499900284,\n\
\ \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.032449808499900284\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7064220183486238,\n \"acc_stderr\": 0.019525151122639667,\n \"\
acc_norm\": 0.7064220183486238,\n \"acc_norm_stderr\": 0.019525151122639667\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373617,\n \"\
acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373617\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6835443037974683,\n \"acc_stderr\": 0.030274974880218977,\n \
\ \"acc_norm\": 0.6835443037974683,\n \"acc_norm_stderr\": 0.030274974880218977\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.04320767807536671,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.04320767807536671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497752,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497752\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.046561471100123514,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.046561471100123514\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077785,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077785\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7420178799489144,\n\
\ \"acc_stderr\": 0.01564583018834895,\n \"acc_norm\": 0.7420178799489144,\n\
\ \"acc_norm_stderr\": 0.01564583018834895\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.026296227915613674,\n\
\ \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.026296227915613674\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n\
\ \"acc_stderr\": 0.015461169002371544,\n \"acc_norm\": 0.3094972067039106,\n\
\ \"acc_norm_stderr\": 0.015461169002371544\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631438,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631438\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n\
\ \"acc_stderr\": 0.027513925683549434,\n \"acc_norm\": 0.6237942122186495,\n\
\ \"acc_norm_stderr\": 0.027513925683549434\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6049382716049383,\n \"acc_stderr\": 0.02720111766692565,\n\
\ \"acc_norm\": 0.6049382716049383,\n \"acc_norm_stderr\": 0.02720111766692565\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.028999080904806185,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.028999080904806185\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.394393741851369,\n\
\ \"acc_stderr\": 0.012482141665631184,\n \"acc_norm\": 0.394393741851369,\n\
\ \"acc_norm_stderr\": 0.012482141665631184\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5196078431372549,\n \"acc_stderr\": 0.020212274976302954,\n \
\ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.020212274976302954\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505415,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505415\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.038581589406855174,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.038581589406855174\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5516274394366725,\n\
\ \"mc2_stderr\": 0.01504190113817455\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759987\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.22744503411675512,\n \
\ \"acc_stderr\": 0.011546363312548092\n }\n}\n```"
repo_url: https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.5-preview
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|arc:challenge|25_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|gsm8k|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hellaswag|10_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T20-01-18.948310.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T20-01-18.948310.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- '**/details_harness|winogrande|5_2023-12-09T20-01-18.948310.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T20-01-18.948310.parquet'
- config_name: results
data_files:
- split: 2023_12_09T20_01_18.948310
path:
- results_2023-12-09T20-01-18.948310.parquet
- split: latest
path:
- results_2023-12-09T20-01-18.948310.parquet
---
# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.5-preview
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.5-preview
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [WebraftAI/synapsellm-7b-mistral-v0.5-preview](https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.5-preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview",
"harness_winogrande_5",
	split="latest")
```
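Replace `harness_winogrande_5` with any config name listed in the metadata above. The `latest` split always mirrors the most recent timestamped split; since the timestamp-named splits sort lexicographically in chronological order, the newest run can also be found directly from the split names (a small sketch, with names taken from this dataset's configs):

```python
# Split names as they appear in this dataset's configs (one run so far).
splits = ["2023_12_09T20_01_18.948310", "latest"]

# Timestamped names sort lexicographically in chronological order,
# so the most recent run is simply the maximum.
timestamped = [s for s in splits if s != "latest"]
most_recent = max(timestamped)
print(most_recent)
```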
## Latest results
These are the [latest results from run 2023-12-09T20:01:18.948310](https://huggingface.co/datasets/open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview/blob/main/results_2023-12-09T20-01-18.948310.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results, and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5441057040654342,
"acc_stderr": 0.03404499199717172,
"acc_norm": 0.5501066597591592,
"acc_norm_stderr": 0.034782781683925894,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5516274394366725,
"mc2_stderr": 0.01504190113817455
},
"harness|arc:challenge|25": {
"acc": 0.4931740614334471,
"acc_stderr": 0.014610029151379813,
"acc_norm": 0.5273037542662116,
"acc_norm_stderr": 0.014589589101985994
},
"harness|hellaswag|10": {
"acc": 0.5624377614021111,
"acc_stderr": 0.004950723480149757,
"acc_norm": 0.7650866361282613,
"acc_norm_stderr": 0.004230782375004432
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296562,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296562
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009794,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009794
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.0381189098894041,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.0381189098894041
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179327,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179327
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.02467786284133278,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.02467786284133278
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6290322580645161,
"acc_stderr": 0.027480541887953593,
"acc_norm": 0.6290322580645161,
"acc_norm_stderr": 0.027480541887953593
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031595,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031595
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7171717171717171,
"acc_stderr": 0.032087795587867514,
"acc_norm": 0.7171717171717171,
"acc_norm_stderr": 0.032087795587867514
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.032922966391551414,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.032922966391551414
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49743589743589745,
"acc_stderr": 0.025350672979412195,
"acc_norm": 0.49743589743589745,
"acc_norm_stderr": 0.025350672979412195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.02742001935094527,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.02742001935094527
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.032449808499900284,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.032449808499900284
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7064220183486238,
"acc_stderr": 0.019525151122639667,
"acc_norm": 0.7064220183486238,
"acc_norm_stderr": 0.019525151122639667
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373617,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6835443037974683,
"acc_stderr": 0.030274974880218977,
"acc_norm": 0.6835443037974683,
"acc_norm_stderr": 0.030274974880218977
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.04320767807536671,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.04320767807536671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497752,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497752
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.046561471100123514,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.046561471100123514
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077785,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077785
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7420178799489144,
"acc_stderr": 0.01564583018834895,
"acc_norm": 0.7420178799489144,
"acc_norm_stderr": 0.01564583018834895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.026296227915613674,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.026296227915613674
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3094972067039106,
"acc_stderr": 0.015461169002371544,
"acc_norm": 0.3094972067039106,
"acc_norm_stderr": 0.015461169002371544
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.027996723180631438,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.027996723180631438
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.027513925683549434,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.027513925683549434
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6049382716049383,
"acc_stderr": 0.02720111766692565,
"acc_norm": 0.6049382716049383,
"acc_norm_stderr": 0.02720111766692565
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.028999080904806185,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.028999080904806185
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.394393741851369,
"acc_stderr": 0.012482141665631184,
"acc_norm": 0.394393741851369,
"acc_norm_stderr": 0.012482141665631184
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.020212274976302954,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.020212274976302954
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505415,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505415
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.038581589406855174,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.038581589406855174
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5516274394366725,
"mc2_stderr": 0.01504190113817455
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759987
},
"harness|gsm8k|5": {
"acc": 0.22744503411675512,
"acc_stderr": 0.011546363312548092
}
}
```
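The headline numbers in the `all` block are aggregates over the per-task entries. As a minimal sketch of working with a dict shaped like the one above (only a few tasks copied here for brevity), the per-task accuracies can be pulled out while skipping the aggregate entry:

```python
# Subset of the results dict above (same shape, fewer tasks for brevity).
results = {
    "all": {"acc": 0.5441057040654342, "acc_norm": 0.5501066597591592},
    "harness|arc:challenge|25": {"acc": 0.4931740614334471, "acc_norm": 0.5273037542662116},
    "harness|hellaswag|10": {"acc": 0.5624377614021111, "acc_norm": 0.7650866361282613},
    "harness|winogrande|5": {"acc": 0.7434885556432518},
}

# Per-task accuracies, skipping the aggregate "all" entry.
per_task = {k: v["acc"] for k, v in results.items() if k != "all"}
best_task = max(per_task, key=per_task.get)
print(best_task, per_task[best_task])
```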
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yuan-sf63/word_label_0.2_64_D | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
- name: '16'
dtype: int64
- name: '17'
dtype: int64
- name: '18'
dtype: int64
- name: '19'
dtype: int64
- name: '20'
dtype: int64
- name: '21'
dtype: int64
- name: '22'
dtype: int64
- name: '23'
dtype: int64
- name: '24'
dtype: int64
- name: '25'
dtype: int64
- name: '26'
dtype: int64
- name: '27'
dtype: int64
- name: '28'
dtype: int64
- name: '29'
dtype: int64
- name: '30'
dtype: int64
- name: '31'
dtype: int64
- name: '32'
dtype: int64
- name: '33'
dtype: int64
- name: '34'
dtype: int64
- name: '35'
dtype: int64
- name: '36'
dtype: int64
- name: '37'
dtype: int64
- name: '38'
dtype: int64
- name: '39'
dtype: int64
- name: '40'
dtype: int64
- name: '41'
dtype: int64
- name: '42'
dtype: int64
- name: '43'
dtype: int64
- name: '44'
dtype: int64
- name: '45'
dtype: int64
- name: '46'
dtype: int64
- name: '47'
dtype: int64
- name: '48'
dtype: int64
- name: '49'
dtype: int64
- name: '50'
dtype: int64
- name: '51'
dtype: int64
- name: '52'
dtype: int64
- name: '53'
dtype: int64
- name: '54'
dtype: int64
- name: '55'
dtype: int64
- name: '56'
dtype: int64
- name: '57'
dtype: int64
- name: '58'
dtype: int64
- name: '59'
dtype: int64
- name: '60'
dtype: int64
- name: '61'
dtype: int64
- name: '62'
dtype: int64
- name: '63'
dtype: int64
splits:
- name: train
num_bytes: 44130140.78143172
num_examples: 71812
- name: validation
num_bytes: 4903895.218568278
num_examples: 7980
download_size: 8294781
dataset_size: 49034036.0
---
# Dataset Card for "word_label_0.2_64_D"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jtatman/databricks-dolly-4k-brainstorm-summary-creative | ---
license: mit
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3853338
num_examples: 3663
download_size: 2447514
dataset_size: 3853338
---
This is a pared-down version of the esoteric categories in the Dolly 15k dataset.
The size is intentional, to allow processing here on the Hub.
.::modification of the Databricks 15k dataset for on-Hub processing::.
|
Omkar7/Medical_data | ---
license: apache-2.0
---
|
tyzhu/squad_wrong_title_v4_train_10_eval_10 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 203084
num_examples: 138
- name: validation
num_bytes: 50820
num_examples: 50
download_size: 65070
dataset_size: 253904
---
# Dataset Card for "squad_wrong_title_v4_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Estwld/esconv_llm | ---
dataset_info:
features:
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
- name: strategy
dtype: string
- name: emotion
dtype: string
- name: experience
dtype: string
- name: problem
dtype: string
- name: situation
dtype: string
splits:
- name: train
num_bytes: 3087133
num_examples: 910
- name: validation
num_bytes: 662566
num_examples: 195
- name: test
num_bytes: 669299
num_examples: 195
download_size: 2158864
dataset_size: 4418998
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
license: apache-2.0
task_categories:
- text-generation
- text-classification
language:
- en
tags:
- esconv
- empathetic
size_categories:
- 1K<n<10K
---
# ESCONV for LLM
This repository contains a reformatted version of the ESConv dataset, tailored for direct use in large language model (LLM) training and inference. The original dataset's format posed challenges for direct application in LLM tasks, prompting us to restructure and clean the data.
## Data Restructuring
1. Assigned the `user` role to `usr` turns and the `assistant` role to `sys` turns.
2. Removed the `survey_score` and `supporter` fields to streamline the data.
## Data Format
Each entry in the reformatted dataset consists of the following fields:
- `conversations`: A list of dictionaries, where each dictionary represents a turn in the dialogue and contains:
  - `role`: A string indicating the speaker's role, either `user` or `assistant`.
  - `content`: A string containing the dialogue content.
  - `strategy`: A string giving the support strategy of the turn; if the role is `user`, the strategy is `NONE`.
- `emotion`: A string indicating the emotional label associated with the dialogue (corresponds to the `emotion_type` field in the original dataset).
- `situation`: A string describing the situational label for the dialogue (corresponds to the `situation` field in the original dataset).
- `problem`: A string describing the problem label for the user (corresponds to the `problem_type` field in the original dataset).
- `experience`: A string corresponding to the `experience_type` field in the original dataset.
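As a quick illustration of the schema above, the sketch below flattens one entry's `conversations` list into plain chat turns. The helper name `to_chat_turns` and the example entry are hypothetical, not part of the dataset; only the field names follow the schema.

```python
# Sketch: flatten one ESConv-style entry into (role, text) chat turns.
# The example entry is illustrative; field names follow the dataset schema.

def to_chat_turns(entry):
    """Return a list of (role, content) pairs; keep the supporter's
    strategy as a prefix on assistant turns when it is not NONE."""
    turns = []
    for turn in entry["conversations"]:
        role, content = turn["role"], turn["content"]
        strategy = turn.get("strategy", "NONE")
        if role == "assistant" and strategy != "NONE":
            content = f"[{strategy}] {content}"
        turns.append((role, content))
    return turns

example = {
    "conversations": [
        {"role": "user", "content": "I feel anxious about work.", "strategy": "NONE"},
        {"role": "assistant", "content": "What is worrying you most?", "strategy": "Question"},
    ],
    "emotion": "anxiety",
    "situation": "job stress",
    "problem": "job crisis",
    "experience": "Previous Experience",
}

for role, text in to_chat_turns(example):
    print(f"{role}: {text}")
```

Rows loaded with `datasets.load_dataset("Estwld/esconv_llm")` follow the same structure and can be passed to a helper like this one.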
## Dataset Statistics
| Dataset | Total Turn | Average Turn | Average Length |
|-------------|------------|--------------|----------------|
| Train | 26,648 | 29.284 | 14.547 |
| Validation | 5,678 | 29.118 | 14.630 |
| Test | 6,039 | 30.969 | 13.756 | |
ThraggBilly/billy_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 56599886.0
num_examples: 833
download_size: 50962974
dataset_size: 56599886.0
---
# Dataset Card for "billy_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1 | ---
pretty_name: Evaluation run of DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1](https://huggingface.co/DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T02:48:34.876063](https://huggingface.co/datasets/open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1/blob/main/results_2023-09-17T02-48-34.876063.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.018141778523489933,\n\
\ \"em_stderr\": 0.0013667968592600823,\n \"f1\": 0.0824182046979865,\n\
\ \"f1_stderr\": 0.0019512337351707363,\n \"acc\": 0.30108941444123377,\n\
\ \"acc_stderr\": 0.0072592536452981875\n },\n \"harness|drop|3\":\
\ {\n \"em\": 0.018141778523489933,\n \"em_stderr\": 0.0013667968592600823,\n\
\ \"f1\": 0.0824182046979865,\n \"f1_stderr\": 0.0019512337351707363\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.000758150113722541\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.601420678768745,\n \"acc_stderr\": 0.013760357176873834\n\
\ }\n}\n```"
repo_url: https://huggingface.co/DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T02_48_34.876063
path:
- '**/details_harness|drop|3_2023-09-17T02-48-34.876063.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T02-48-34.876063.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T02_48_34.876063
path:
- '**/details_harness|gsm8k|5_2023-09-17T02-48-34.876063.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T02-48-34.876063.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T02_48_34.876063
path:
- '**/details_harness|winogrande|5_2023-09-17T02-48-34.876063.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T02-48-34.876063.parquet'
- config_name: results
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- results_2023-08-17T19:06:24.257655.parquet
- split: 2023_09_17T02_48_34.876063
path:
- results_2023-09-17T02-48-34.876063.parquet
- split: latest
path:
- results_2023-09-17T02-48-34.876063.parquet
---
# Dataset Card for Evaluation run of DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1](https://huggingface.co/DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
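As the split naming suggests, each run's split name is derived from its timestamp: the `-` and `:` separators become `_`, while the fractional-seconds `.` is kept. A minimal sketch of this mapping (the helper name is hypothetical, not part of the `datasets` API):

```python
# Sketch: map a run timestamp to its split name as used in this dataset's
# configs. '-' and ':' become '_'; the fractional-seconds '.' is kept.
def timestamp_to_split(ts: str) -> str:
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-17T02:48:34.876063"))
# 2023_09_17T02_48_34.876063
```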
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1",
"harness_winogrande_5",
	split="latest")
```
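Because the timestamped split names sort lexicographically in chronological order, the most recent run can be picked from a configuration's split names without parsing dates. A small sketch (`latest_run` is a hypothetical helper, not part of the `datasets` API):

```python
# Sketch: choose the most recent timestamped split, skipping the "latest"
# alias. Timestamped names sort lexicographically in chronological order.
def latest_run(split_names):
    return max(s for s in split_names if s != "latest")

print(latest_run(["2023_08_17T19_06_24.257655",
                  "2023_09_17T02_48_34.876063",
                  "latest"]))
# 2023_09_17T02_48_34.876063
```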
## Latest results
These are the [latest results from run 2023-09-17T02:48:34.876063](https://huggingface.co/datasets/open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1/blob/main/results_2023-09-17T02-48-34.876063.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" and "latest" splits for each eval):
```python
{
"all": {
"em": 0.018141778523489933,
"em_stderr": 0.0013667968592600823,
"f1": 0.0824182046979865,
"f1_stderr": 0.0019512337351707363,
"acc": 0.30108941444123377,
"acc_stderr": 0.0072592536452981875
},
"harness|drop|3": {
"em": 0.018141778523489933,
"em_stderr": 0.0013667968592600823,
"f1": 0.0824182046979865,
"f1_stderr": 0.0019512337351707363
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.000758150113722541
},
"harness|winogrande|5": {
"acc": 0.601420678768745,
"acc_stderr": 0.013760357176873834
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
idning/ffhq128-caption | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2053430676.0
num_examples: 70000
download_size: 2051404020
dataset_size: 2053430676.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
agil/EUIPO_QA | ---
dataset_info:
features:
- name: ID
dtype: int64
- name: question
dtype: string
- name: source
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 145159.30633802817
num_examples: 227
- name: test
num_bytes: 36449.69366197183
num_examples: 57
download_size: 93579
dataset_size: 181609.0
---
# Dataset Card for "EUIPO_QA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-muse256-muse512-wuerst-sdv15/96998511 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 165
num_examples: 10
download_size: 1327
dataset_size: 165
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "96998511"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FaalSa/dfaas2 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 57633
num_examples: 1
- name: validation
num_bytes: 58113
num_examples: 1
- name: test
num_bytes: 58593
num_examples: 1
download_size: 20510
dataset_size: 174339
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
CyberHarem/tamaki_iroha_puellamagimadokamagicasidestorymagiarecord | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Tamaki Iroha
This is the dataset of Tamaki Iroha, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 694 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 694 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 694 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 694 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
infinex/LaMini-en-id-sampled | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 28634119.980589807
num_examples: 69999
- name: test
num_bytes: 12271941.019410195
num_examples: 30000
download_size: 24029254
dataset_size: 40906061.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
harshiv/placement | ---
license: unknown
---
|
Chapad0o/Ursos_sem_curso | ---
license: openrail
---
|
open-llm-leaderboard/details_giraffe176__Starling_Monarch_Westlake_Garten-7B-v0.1 | ---
pretty_name: Evaluation run of giraffe176/Starling_Monarch_Westlake_Garten-7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [giraffe176/Starling_Monarch_Westlake_Garten-7B-v0.1](https://huggingface.co/giraffe176/Starling_Monarch_Westlake_Garten-7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"latest\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_giraffe176__Starling_Monarch_Westlake_Garten-7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-15T22:54:03.585658](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__Starling_Monarch_Westlake_Garten-7B-v0.1/blob/main/results_2024-03-15T22-54-03.585658.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6566264275245318,\n\
\ \"acc_stderr\": 0.03194960876689088,\n \"acc_norm\": 0.6557673105113733,\n\
\ \"acc_norm_stderr\": 0.03262280468155444,\n \"mc1\": 0.5152998776009792,\n\
\ \"mc1_stderr\": 0.0174953044731879,\n \"mc2\": 0.6792045908527932,\n\
\ \"mc2_stderr\": 0.014999758484035728\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6936860068259386,\n \"acc_stderr\": 0.013470584417276514,\n\
\ \"acc_norm\": 0.7175767918088737,\n \"acc_norm_stderr\": 0.013155456884097224\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7045409281019717,\n\
\ \"acc_stderr\": 0.004553164013379556,\n \"acc_norm\": 0.8814977096195977,\n\
\ \"acc_norm_stderr\": 0.0032254141192897133\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328972,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328972\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.038498560987940876,\n \"\
acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940876\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n\
\ \"acc_stderr\": 0.013223928616741622,\n \"acc_norm\": 0.8365261813537676,\n\
\ \"acc_norm_stderr\": 0.013223928616741622\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4558659217877095,\n\
\ \"acc_stderr\": 0.01665722942458631,\n \"acc_norm\": 0.4558659217877095,\n\
\ \"acc_norm_stderr\": 0.01665722942458631\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n\
\ \"acc_stderr\": 0.01275015180292244,\n \"acc_norm\": 0.47196870925684486,\n\
\ \"acc_norm_stderr\": 0.01275015180292244\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.01879808628488689,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.01879808628488689\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5152998776009792,\n\
\ \"mc1_stderr\": 0.0174953044731879,\n \"mc2\": 0.6792045908527932,\n\
\ \"mc2_stderr\": 0.014999758484035728\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433535\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7194844579226687,\n \
\ \"acc_stderr\": 0.012374608490929554\n }\n}\n```"
repo_url: https://huggingface.co/giraffe176/Starling_Monarch_Westlake_Garten-7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|arc:challenge|25_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|gsm8k|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hellaswag|10_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T22-54-03.585658.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T22-54-03.585658.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- '**/details_harness|winogrande|5_2024-03-15T22-54-03.585658.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-15T22-54-03.585658.parquet'
- config_name: results
data_files:
- split: 2024_03_15T22_54_03.585658
path:
- results_2024-03-15T22-54-03.585658.parquet
- split: latest
path:
- results_2024-03-15T22-54-03.585658.parquet
---
# Dataset Card for Evaluation run of giraffe176/Starling_Monarch_Westlake_Garten-7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [giraffe176/Starling_Monarch_Westlake_Garten-7B-v0.1](https://huggingface.co/giraffe176/Starling_Monarch_Westlake_Garten-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_giraffe176__Starling_Monarch_Westlake_Garten-7B-v0.1",
	"harness_winogrande_5",
	split="latest")
```
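Once loaded, per-task scores can be aggregated the same way the leaderboard computes the `"all"` entry, i.e. a plain macro-average over tasks. A minimal, self-contained sketch (the task names and `acc` values below are copied from the results JSON shown later in this card; only three tasks are included for brevity):

```python
# Macro-average per-task accuracies, mirroring the "all" aggregate.
# Values are taken from the results JSON in this card (truncated to
# three tasks for illustration).
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.33,
    "harness|hendrycksTest-anatomy|5": 0.6518518518518519,
    "harness|hendrycksTest-astronomy|5": 0.7105263157894737,
}

macro_avg = sum(task_acc.values()) / len(task_acc)
print(f"macro-averaged acc over {len(task_acc)} tasks: {macro_avg:.4f}")
```

The full leaderboard average is computed over all 57 MMLU tasks (plus the other benchmarks), so this three-task figure is illustrative only.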
## Latest results
These are the [latest results from run 2024-03-15T22:54:03.585658](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__Starling_Monarch_Westlake_Garten-7B-v0.1/blob/main/results_2024-03-15T22-54-03.585658.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6566264275245318,
"acc_stderr": 0.03194960876689088,
"acc_norm": 0.6557673105113733,
"acc_norm_stderr": 0.03262280468155444,
"mc1": 0.5152998776009792,
"mc1_stderr": 0.0174953044731879,
"mc2": 0.6792045908527932,
"mc2_stderr": 0.014999758484035728
},
"harness|arc:challenge|25": {
"acc": 0.6936860068259386,
"acc_stderr": 0.013470584417276514,
"acc_norm": 0.7175767918088737,
"acc_norm_stderr": 0.013155456884097224
},
"harness|hellaswag|10": {
"acc": 0.7045409281019717,
"acc_stderr": 0.004553164013379556,
"acc_norm": 0.8814977096195977,
"acc_norm_stderr": 0.0032254141192897133
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328972,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328972
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290902,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290902
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940876,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940876
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741622,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741622
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4558659217877095,
"acc_stderr": 0.01665722942458631,
"acc_norm": 0.4558659217877095,
"acc_norm_stderr": 0.01665722942458631
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.01275015180292244,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.01275015180292244
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.01879808628488689,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.01879808628488689
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5152998776009792,
"mc1_stderr": 0.0174953044731879,
"mc2": 0.6792045908527932,
"mc2_stderr": 0.014999758484035728
},
"harness|winogrande|5": {
"acc": 0.8453038674033149,
"acc_stderr": 0.010163172650433535
},
"harness|gsm8k|5": {
"acc": 0.7194844579226687,
"acc_stderr": 0.012374608490929554
}
}
```
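Every per-task record in the JSON above shares the same shape (`acc`, `acc_stderr`, plus the `_norm` variants), so downstream aggregation is straightforward. A minimal sketch, using three values copied from the results above (an illustrative subset, not the full task list):

```python
# Aggregate the `acc` field over a small illustrative subset of the
# per-task records shown above (values copied verbatim from the JSON).
results = {
    "harness|hendrycksTest-computer_security|5": {"acc": 0.74},
    "harness|hendrycksTest-college_physics|5": {"acc": 0.4019607843137255},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8421052631578947},
}

mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(f"mean acc over {len(results)} tasks: {mean_acc:.4f}")
```

The same pattern extends to the full result set once the complete JSON is loaded, e.g. via `json.load`.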
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
one-sec-cv12/chunk_120 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 26957119824.125
num_examples: 280663
download_size: 25198915226
dataset_size: 26957119824.125
---
# Dataset Card for "chunk_120"
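The split metadata above is enough for a quick storage estimate before downloading anything; dividing `num_bytes` by `num_examples` (plain arithmetic on the figures in the YAML header) gives the average per-example audio payload:

```python
# Average on-disk size per example, from the split metadata above.
num_bytes = 26957119824.125
num_examples = 280663

avg_bytes = num_bytes / num_examples
print(f"~{avg_bytes:,.0f} bytes per audio example")
```

At roughly 96 kB per clip, streaming access (`load_dataset(..., streaming=True)`) may be preferable to fetching the full ~25 GB archive up front.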
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuggingFaceH4/deita-6k-v0-sft | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 282384543.6
num_examples: 5700
- name: test_sft
num_bytes: 14862344.4
num_examples: 300
- name: train_gen
num_bytes: 276218301
num_examples: 5700
- name: test_gen
num_bytes: 13232842
num_examples: 300
download_size: 232332840
dataset_size: 586698031.0
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
- split: train_gen
path: data/train_gen-*
- split: test_gen
path: data/test_gen-*
---
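The `splits` metadata above is internally consistent: the four `num_bytes` values sum to the declared `dataset_size`. A quick sanity check (pure arithmetic on the numbers in the YAML header):

```python
# The four split byte counts above should add up to dataset_size.
split_bytes = {
    "train_sft": 282384543.6,
    "test_sft": 14862344.4,
    "train_gen": 276218301,
    "test_gen": 13232842,
}
dataset_size = 586698031.0

total = sum(split_bytes.values())
assert abs(total - dataset_size) < 1e-3  # equal up to float rounding
print(f"total bytes across splits: {total:.1f}")
```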
|
liuyanchen1015/MULTI_VALUE_rte_participle_past_tense | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 199017
num_examples: 488
- name: train
num_bytes: 155512
num_examples: 361
download_size: 235435
dataset_size: 354529
---
# Dataset Card for "MULTI_VALUE_rte_participle_past_tense"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vedica1011/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4049648
num_examples: 1000
download_size: 2170714
dataset_size: 4049648
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nxgiz/test | ---
license: mit
---
|
irds/mmarco_v2_es | ---
pretty_name: '`mmarco/v2/es`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/v2/es`
The `mmarco/v2/es` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/v2/es).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=8,841,823
This dataset is used by: [`mmarco_v2_es_dev`](https://huggingface.co/datasets/irds/mmarco_v2_es_dev), [`mmarco_v2_es_train`](https://huggingface.co/datasets/irds/mmarco_v2_es_train)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/mmarco_v2_es', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
CyberHarem/arisugawa_natsuha_theidolmstershinycolors | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of arisugawa_natsuha/有栖川夏葉/아리스가와나츠하 (THE iDOLM@STER: SHINY COLORS)
This is the dataset of arisugawa_natsuha/有栖川夏葉/아리스가와나츠하 (THE iDOLM@STER: SHINY COLORS), containing 500 images and their tags.
The core tags of this character are `long_hair, breasts, bangs, orange_hair, large_breasts, red_hair, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 956.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/arisugawa_natsuha_theidolmstershinycolors/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 463.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/arisugawa_natsuha_theidolmstershinycolors/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1305 | 1.02 GiB | [Download](https://huggingface.co/datasets/CyberHarem/arisugawa_natsuha_theidolmstershinycolors/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 810.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/arisugawa_natsuha_theidolmstershinycolors/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1305 | 1.60 GiB | [Download](https://huggingface.co/datasets/CyberHarem/arisugawa_natsuha_theidolmstershinycolors/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/arisugawa_natsuha_theidolmstershinycolors',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 40 |  |  |  |  |  | rabbit_ears, 1girl, fake_animal_ears, playboy_bunny, detached_collar, wrist_cuffs, looking_at_viewer, strapless_leotard, red_leotard, solo, ponytail, red_bowtie, fishnet_pantyhose, cleavage, bare_shoulders, blush, nail_polish, black_pantyhose, black_eyes, smile, jewelry, medium_breasts, rabbit_tail, card |
| 1 | 18 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, white_bikini, blush, navel, o-ring_bikini, solo, collarbone, criss-cross_halter, sunglasses, bare_shoulders, eyewear_on_head, smile, ahoge, black_eyes, medium_breasts, bracelet, open_mouth, tinted_eyewear, see-through |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, brown_eyes, cleavage, collarbone, looking_at_viewer, outdoors, solo, blue_sky, blush, day, navel, o-ring_bikini, water, wet, white_bikini, criss-cross_halter, ocean, open_mouth, :d, bracelet, brown_hair, closed_mouth, cowboy_shot, standing, stomach, sunlight, thighs, wading |
| 3 | 14 |  |  |  |  |  | 1girl, solo, yellow_bikini, cleavage, looking_at_viewer, navel, sunglasses, bare_shoulders, blush, eyewear_on_head, smile, blue_shorts, collarbone, crop_top, bracelet, denim_shorts, midriff, outdoors, short_shorts, grey_eyes, ocean, ahoge, single_hair_bun, armpits, bikini_under_clothes, black_eyes, cloud, day, sky, white_background |
| 4 | 5 |  |  |  |  |  | 1girl, beach, blue_sky, blush, collarbone, completely_nude, day, navel, nipples, ocean, outdoors, pussy, solo, armpits, arms_behind_head, arms_up, ass_visible_through_thighs, cleft_of_venus, eyewear_on_head, looking_at_viewer, open_mouth, smile, sunglasses, water, wet, ;d, ahoge, one_eye_closed, black_eyes, swept_bangs |
| 5 | 15 |  |  |  |  |  | 1girl, cleavage, solo, collarbone, looking_at_viewer, white_background, blush, simple_background, smile, white_shirt, necklace, short_hair, upper_body, bra |
| 6 | 6 |  |  |  |  |  | 1girl, blush, bridal_veil, cleavage, looking_at_viewer, necklace, wedding_dress, white_dress, bare_shoulders, collarbone, medium_breasts, rose, smile, solo, bride, hair_flower, off-shoulder_dress, white_gloves, black_eyes, bridal_gauntlets, brown_eyes, white_flower |
| 7 | 31 |  |  |  |  |  | 1girl, blush, hetero, nipples, 1boy, swept_bangs, sex, sweat, completely_nude, solo_focus, looking_at_viewer, vaginal, penis, pussy, collarbone, navel, female_pubic_hair, spread_legs, grey_eyes, mosaic_censoring, thighs, girl_on_top, straddling, ahoge, on_bed, on_back, open_mouth, blur_censor |
| 8 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, penis, solo_focus, fellatio, jewelry, mosaic_censoring, nude, sweat, male_pubic_hair, open_mouth, saliva, tongue_out, ahoge, collarbone, cum |
| 9 | 11 |  |  |  |  |  | 1girl, black_gloves, china_dress, looking_at_viewer, blush, bun_cover, double_bun, jewelry, purple_dress, solo, bare_shoulders, cleavage_cutout, smile, arm_garter, ahoge, medium_breasts, sleeveless_dress, swept_bangs, closed_mouth, pelvic_curtain, thighs, baozi, bridal_garter, brown_hair, covered_navel, groin, simple_background, white_background |
| 10 | 9 |  |  |  |  |  | 1girl, blush, frills, looking_at_viewer, maid_headdress, puffy_short_sleeves, solo, wrist_cuffs, black_dress, simple_background, white_apron, white_background, enmaided, waist_apron, smile, swept_bangs, white_thighhighs |
| 11 | 9 |  |  |  |  |  | 1girl, blush, short_sleeves, solo, ahoge, looking_at_viewer, bare_shoulders, jewelry, open_mouth, shirt, skirt, smile, floral_print, food, shoulder_cutout, sweat, upper_body |
| 12 | 6 |  |  |  |  |  | 1girl, jewelry, looking_at_viewer, shirt, smile, solo, ahoge, blush, long_sleeves, collarbone, holding_cup, upper_body, wine_glass, brown_eyes, nail_polish, see-through, sitting |
| 13 | 7 |  |  |  |  |  | looking_at_viewer, short_sleeves, 1girl, grey_shirt, jacket_around_waist, solo, wrist_scrunchie, :d, open_mouth, pleated_skirt, purple_skirt, sweater_around_waist, miniskirt, wavy_hair, white_background, white_thighhighs, collarbone, cowboy_shot, hair_bow, holding, plaid, purple_necktie, purple_scrunchie, simple_background, single_thighhigh |
| 14 | 5 |  |  |  |  |  | 1girl, floral_print, hair_flower, looking_at_viewer, obi, smile, solo, wide_sleeves, ahoge, black_eyes, blush, holding, long_sleeves, outdoors, print_kimono, hair_bun, new_year, snowing, tree, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | rabbit_ears | 1girl | fake_animal_ears | playboy_bunny | detached_collar | wrist_cuffs | looking_at_viewer | strapless_leotard | red_leotard | solo | ponytail | red_bowtie | fishnet_pantyhose | cleavage | bare_shoulders | blush | nail_polish | black_pantyhose | black_eyes | smile | jewelry | medium_breasts | rabbit_tail | card | white_bikini | navel | o-ring_bikini | collarbone | criss-cross_halter | sunglasses | eyewear_on_head | ahoge | bracelet | open_mouth | tinted_eyewear | see-through | brown_eyes | outdoors | blue_sky | day | water | wet | ocean | :d | brown_hair | closed_mouth | cowboy_shot | standing | stomach | sunlight | thighs | wading | yellow_bikini | blue_shorts | crop_top | denim_shorts | midriff | short_shorts | grey_eyes | single_hair_bun | armpits | bikini_under_clothes | cloud | sky | white_background | beach | completely_nude | nipples | pussy | arms_behind_head | arms_up | ass_visible_through_thighs | cleft_of_venus | ;d | one_eye_closed | swept_bangs | simple_background | white_shirt | necklace | short_hair | upper_body | bra | bridal_veil | wedding_dress | white_dress | rose | bride | hair_flower | off-shoulder_dress | white_gloves | bridal_gauntlets | white_flower | hetero | 1boy | sex | sweat | solo_focus | vaginal | penis | female_pubic_hair | spread_legs | mosaic_censoring | girl_on_top | straddling | on_bed | on_back | blur_censor | fellatio | nude | male_pubic_hair | saliva | tongue_out | cum | black_gloves | china_dress | bun_cover | double_bun | purple_dress | cleavage_cutout | arm_garter | sleeveless_dress | pelvic_curtain | baozi | bridal_garter | covered_navel | groin | frills | maid_headdress | puffy_short_sleeves | black_dress | white_apron | enmaided | waist_apron | white_thighhighs | short_sleeves | shirt | skirt | floral_print | food | shoulder_cutout | long_sleeves | holding_cup | wine_glass | sitting | grey_shirt | jacket_around_waist | wrist_scrunchie | pleated_skirt | purple_skirt | sweater_around_waist | miniskirt | wavy_hair | hair_bow | holding | plaid | purple_necktie | purple_scrunchie | single_thighhigh | obi | wide_sleeves | print_kimono | hair_bun | new_year | snowing | tree |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------|:--------|:-------------------|:----------------|:------------------|:--------------|:--------------------|:--------------------|:--------------|:-------|:-----------|:-------------|:--------------------|:-----------|:-----------------|:--------|:--------------|:------------------|:-------------|:--------|:----------|:-----------------|:--------------|:-------|:---------------|:--------|:----------------|:-------------|:---------------------|:-------------|:------------------|:--------|:-----------|:-------------|:-----------------|:--------------|:-------------|:-----------|:-----------|:------|:--------|:------|:--------|:-----|:-------------|:---------------|:--------------|:-----------|:----------|:-----------|:---------|:---------|:----------------|:--------------|:-----------|:---------------|:----------|:---------------|:------------|:------------------|:----------|:-----------------------|:--------|:------|:-------------------|:--------|:------------------|:----------|:--------|:-------------------|:----------|:-----------------------------|:-----------------|:-----|:-----------------|:--------------|:--------------------|:--------------|:-----------|:-------------|:-------------|:------|:--------------|:----------------|:--------------|:-------|:--------|:--------------|:---------------------|:---------------|:-------------------|:---------------|:---------|:-------|:------|:--------|:-------------|:----------|:--------|:--------------------|:--------------|:-------------------|:--------------|:-------------|:---------|:----------|:--------------|:-----------|:-------|:------------------|:---------|:-------------|:------|:---------------|:--------------|:------------|:-------------|:---------------|:------------------|:-------------|:-------------------|:-----------------|:--------|:----------------|:----------------|:--------|:---------|:-----------------|:----------------------|:--------------|:--------------|:-----------|:--------------|:-------------------|:----------------|:--------|:--------|:---------------|:-------|:------------------|:---------------|:--------------|:-------------|:----------|:-------------|:----------------------|:------------------|:----------------|:---------------|:-----------------------|:------------|:------------|:-----------|:----------|:--------|:-----------------|:-------------------|:-------------------|:------|:---------------|:---------------|:-----------|:-----------|:----------|:-------|
| 0 | 40 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 18 |  |  |  |  |  | | X | | | | | X | | | X | | | | X | X | X | | | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | | X | | | | | X | | | X | | | | X | X | X | | | | | | | | | X | X | X | X | X | | | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | | X | | | | | X | | | X | | | | X | X | X | | | X | X | | | | | | X | | X | | X | X | X | X | | | | | X | | X | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | | X | | | | | X | | | X | | | | | | X | | | X | X | | | | | | X | | X | | X | X | X | | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 15 |  |  |  |  |  | | X | | | | | X | | | X | | | | X | | X | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | | X | | | | | X | | | X | | | | X | X | X | | | X | X | | X | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 31 |  |  |  |  |  | | X | | | | | X | | | | | | | | | X | | | | | | | | | | X | | X | | | | X | | X | | | | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | X | X | X | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | | X | | | | | | | | | | | | | | X | | | | | X | | | | | | | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | X | X | | X | | | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 11 |  |  |  |  |  | | X | | | | | X | | | X | | | | | X | X | | | | X | X | X | | | | | | | | | | X | | | | | | | | | | | | | X | X | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 9 |  |  |  |  |  | | X | | | | X | X | | | X | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 9 |  |  |  |  |  | | X | | | | | X | | | X | | | | | X | X | | | | X | X | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 6 |  |  |  |  |  | | X | | | | | X | | | X | | | | | | X | X | | | X | X | | | | | | | X | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 13 | 7 |  |  |  |  |  | | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 14 | 5 |  |  |  |  |  | | X | | | | | X | | | X | | | | | | X | | | X | X | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_222gate__Blurdus-7b-v0.1 | ---
pretty_name: Evaluation run of 222gate/Blurdus-7b-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [222gate/Blurdus-7b-v0.1](https://huggingface.co/222gate/Blurdus-7b-v0.1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_222gate__Blurdus-7b-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T04:02:50.944739](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__Blurdus-7b-v0.1/blob/main/results_2024-01-21T04-02-50.944739.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6537793068429774,\n\
\ \"acc_stderr\": 0.03204806721727468,\n \"acc_norm\": 0.6535097790386686,\n\
\ \"acc_norm_stderr\": 0.03271036283162906,\n \"mc1\": 0.5740514075887393,\n\
\ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.6971802454568737,\n\
\ \"mc2_stderr\": 0.015138148073785463\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.01338502163731357,\n\
\ \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059376\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7225652260505875,\n\
\ \"acc_stderr\": 0.004468178273665677,\n \"acc_norm\": 0.8849830711013742,\n\
\ \"acc_norm_stderr\": 0.0031839033919416975\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"\
acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590172,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590172\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n\
\ \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n\
\ \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n\
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46033519553072627,\n\
\ \"acc_stderr\": 0.016669799592112032,\n \"acc_norm\": 0.46033519553072627,\n\
\ \"acc_norm_stderr\": 0.016669799592112032\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n\
\ \"acc_stderr\": 0.01274724896707907,\n \"acc_norm\": 0.470013037809648,\n\
\ \"acc_norm_stderr\": 0.01274724896707907\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487043,\n \
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487043\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174934,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174934\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5740514075887393,\n\
\ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.6971802454568737,\n\
\ \"mc2_stderr\": 0.015138148073785463\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825907\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6785443517816527,\n \
\ \"acc_stderr\": 0.012864471384836705\n }\n}\n```"
repo_url: https://huggingface.co/222gate/Blurdus-7b-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|arc:challenge|25_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|gsm8k|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hellaswag|10_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T04-02-50.944739.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T04-02-50.944739.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- '**/details_harness|winogrande|5_2024-01-21T04-02-50.944739.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T04-02-50.944739.parquet'
- config_name: results
data_files:
- split: 2024_01_21T04_02_50.944739
path:
- results_2024-01-21T04-02-50.944739.parquet
- split: latest
path:
- results_2024-01-21T04-02-50.944739.parquet
---
# Dataset Card for Evaluation run of 222gate/Blurdus-7b-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [222gate/Blurdus-7b-v0.1](https://huggingface.co/222gate/Blurdus-7b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_222gate__Blurdus-7b-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-21T04:02:50.944739](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__Blurdus-7b-v0.1/blob/main/results_2024-01-21T04-02-50.944739.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6537793068429774,
"acc_stderr": 0.03204806721727468,
"acc_norm": 0.6535097790386686,
"acc_norm_stderr": 0.03271036283162906,
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.6971802454568737,
"mc2_stderr": 0.015138148073785463
},
"harness|arc:challenge|25": {
"acc": 0.7005119453924915,
"acc_stderr": 0.01338502163731357,
"acc_norm": 0.7226962457337884,
"acc_norm_stderr": 0.013082095839059376
},
"harness|hellaswag|10": {
"acc": 0.7225652260505875,
"acc_stderr": 0.004468178273665677,
"acc_norm": 0.8849830711013742,
"acc_norm_stderr": 0.0031839033919416975
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590172,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590172
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46033519553072627,
"acc_stderr": 0.016669799592112032,
"acc_norm": 0.46033519553072627,
"acc_norm_stderr": 0.016669799592112032
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.01274724896707907,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.01274724896707907
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487043,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487043
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174934,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174934
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.6971802454568737,
"mc2_stderr": 0.015138148073785463
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.010569021122825907
},
"harness|gsm8k|5": {
"acc": 0.6785443517816527,
"acc_stderr": 0.012864471384836705
}
}
```
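As a sketch of how these aggregated numbers can be consumed programmatically, the snippet below parses a hand-copied subset of the "all" block from the results JSON above and rounds the metrics for display. The `results_json` literal is an assumption for illustration; in practice you would read the downloaded results file instead.

```python
import json

# Hand-copied subset of the "all" block from the results JSON above
# (illustrative only; normally loaded from the results parquet/JSON file).
results_json = """
{
  "all": {
    "acc": 0.6537793068429774,
    "acc_stderr": 0.03204806721727468,
    "acc_norm": 0.6535097790386686,
    "acc_norm_stderr": 0.03271036283162906
  }
}
"""

# Parse the payload and round each headline metric to 4 decimal places.
metrics = json.loads(results_json)["all"]
headline = {name: round(value, 4) for name, value in metrics.items()}
print(headline)
```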
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dhuynh95/Magicoder-Evol-Instruct-1000-CodeLlama-70b-tokenized-0.5-Special-Token | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2235620
num_examples: 1000
download_size: 1123241
dataset_size: 2235620
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
megatomik/FAdataset | ---
license: unknown
---
Scraped images from Furaffinity and their corresponding (preprocessed) tags. May contain adult content. |
bloyal/uniref50-1M | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: special_tokens_mask
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 3084000000
num_examples: 1000000
- name: validation
num_bytes: 154200000
num_examples: 50000
- name: test
num_bytes: 154200000
num_examples: 50000
download_size: 181091871
dataset_size: 3392400000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
shuoshuo0829/155151 | ---
license: apache-2.0
---
|
AdapterOcean/augmentatio-standardized_cluster_6_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 5398233
num_examples: 2752
download_size: 1995939
dataset_size: 5398233
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "augmentatio-standardized_cluster_6_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
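The card above declares plain `input`/`output` columns in an Alpaca-style dataset. A minimal sketch of turning such a pair into a single training prompt is shown below; the exact template text is an assumption, not taken from this dataset.

```python
# Hypothetical Alpaca-style prompt template; the wording is an assumption.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{input}\n\n### Response:\n{output}"
)

def to_alpaca(example: dict) -> dict:
    """Render one input/output row into a single `text` field."""
    return {"text": ALPACA_TEMPLATE.format(**example)}

row = {"input": "List three primes.", "output": "2, 3, 5"}
print(to_alpaca(row)["text"])
```

A map over the whole split (e.g. `ds.map(to_alpaca)`) would produce a `text` column ready for causal-LM fine-tuning.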
Atipico1/webq-top5_preprocessed | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: masked_query
dtype: string
- name: query_embedding
sequence: float32
splits:
- name: train
num_bytes: 24561156
num_examples: 3778
- name: test
num_bytes: 13226950
num_examples: 2032
download_size: 33063836
dataset_size: 37788106
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
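Given the `ctxs` schema above, where each retrieved passage carries a `hasanswer` flag, a common evaluation is top-k retrieval accuracy. The sketch below computes it from plain dictionaries; the helper name and the toy examples are illustrative, not part of the dataset.

```python
# Sketch: top-k retrieval accuracy from a list of passages with `hasanswer` flags.
def topk_hit(ctxs: list[dict], k: int = 5) -> bool:
    """True if any of the first k retrieved passages contains an answer."""
    return any(c["hasanswer"] for c in ctxs[:k])

# Toy examples standing in for dataset rows (illustrative only).
examples = [
    {"ctxs": [{"hasanswer": False}, {"hasanswer": True}]},
    {"ctxs": [{"hasanswer": False}]},
]
accuracy = sum(topk_hit(e["ctxs"]) for e in examples) / len(examples)
print(accuracy)  # → 0.5
```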
davanstrien/haiku-dpo |
open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost-DPO-preview | ---
pretty_name: Evaluation run of 0-hero/Matter-0.1-7B-boost-DPO-preview
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [0-hero/Matter-0.1-7B-boost-DPO-preview](https://huggingface.co/0-hero/Matter-0.1-7B-boost-DPO-preview)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost-DPO-preview\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T09:01:45.587641](https://huggingface.co/datasets/open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost-DPO-preview/blob/main/results_2024-03-22T09-01-45.587641.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6208780416068335,\n\
\ \"acc_stderr\": 0.03276200790622768,\n \"acc_norm\": 0.6241469474445721,\n\
\ \"acc_norm_stderr\": 0.0334142785449673,\n \"mc1\": 0.42105263157894735,\n\
\ \"mc1_stderr\": 0.017283936248136487,\n \"mc2\": 0.5885916212997572,\n\
\ \"mc2_stderr\": 0.015413433354723878\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6194539249146758,\n \"acc_stderr\": 0.014188277712349812,\n\
\ \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.013975454122756564\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6414060944035053,\n\
\ \"acc_stderr\": 0.0047860751075721845,\n \"acc_norm\": 0.8287193786098387,\n\
\ \"acc_norm_stderr\": 0.003759840127150708\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395269,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395269\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.02467786284133278,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.02467786284133278\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895525,\n \"\
acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878937,\n\
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878937\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059278,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059278\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099864,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099864\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286462,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286462\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489284,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489284\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.014143970276657574,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.014143970276657574\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.02519018132760841,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.02519018132760841\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n\
\ \"acc_stderr\": 0.01661139368726859,\n \"acc_norm\": 0.4424581005586592,\n\
\ \"acc_norm_stderr\": 0.01661139368726859\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4445893089960887,\n\
\ \"acc_stderr\": 0.012691575792657114,\n \"acc_norm\": 0.4445893089960887,\n\
\ \"acc_norm_stderr\": 0.012691575792657114\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6454248366013072,\n \"acc_stderr\": 0.01935336054755369,\n \
\ \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.01935336054755369\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.029504896454595957,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.029504896454595957\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42105263157894735,\n\
\ \"mc1_stderr\": 0.017283936248136487,\n \"mc2\": 0.5885916212997572,\n\
\ \"mc2_stderr\": 0.015413433354723878\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7584846093133386,\n \"acc_stderr\": 0.012028983782011874\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5041698256254739,\n \
\ \"acc_stderr\": 0.01377200577479154\n }\n}\n```"
repo_url: https://huggingface.co/0-hero/Matter-0.1-7B-boost-DPO-preview
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|arc:challenge|25_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|gsm8k|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hellaswag|10_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T09-01-45.587641.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T09-01-45.587641.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- '**/details_harness|winogrande|5_2024-03-22T09-01-45.587641.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T09-01-45.587641.parquet'
- config_name: results
data_files:
- split: 2024_03_22T09_01_45.587641
path:
- results_2024-03-22T09-01-45.587641.parquet
- split: latest
path:
- results_2024-03-22T09-01-45.587641.parquet
---
# Dataset Card for Evaluation run of 0-hero/Matter-0.1-7B-boost-DPO-preview
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [0-hero/Matter-0.1-7B-boost-DPO-preview](https://huggingface.co/0-hero/Matter-0.1-7B-boost-DPO-preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost-DPO-preview",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-22T09:01:45.587641](https://huggingface.co/datasets/open-llm-leaderboard/details_0-hero__Matter-0.1-7B-boost-DPO-preview/blob/main/results_2024-03-22T09-01-45.587641.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6208780416068335,
"acc_stderr": 0.03276200790622768,
"acc_norm": 0.6241469474445721,
"acc_norm_stderr": 0.0334142785449673,
"mc1": 0.42105263157894735,
"mc1_stderr": 0.017283936248136487,
"mc2": 0.5885916212997572,
"mc2_stderr": 0.015413433354723878
},
"harness|arc:challenge|25": {
"acc": 0.6194539249146758,
"acc_stderr": 0.014188277712349812,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.013975454122756564
},
"harness|hellaswag|10": {
"acc": 0.6414060944035053,
"acc_stderr": 0.0047860751075721845,
"acc_norm": 0.8287193786098387,
"acc_norm_stderr": 0.003759840127150708
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.02467786284133278,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.02467786284133278
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895525,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878937,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878937
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059278,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059278
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099864,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099864
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286462,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286462
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489284,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489284
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657574,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657574
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.02519018132760841,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.02519018132760841
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4424581005586592,
"acc_stderr": 0.01661139368726859,
"acc_norm": 0.4424581005586592,
"acc_norm_stderr": 0.01661139368726859
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4445893089960887,
"acc_stderr": 0.012691575792657114,
"acc_norm": 0.4445893089960887,
"acc_norm_stderr": 0.012691575792657114
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.01935336054755369,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.01935336054755369
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.029504896454595957,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.029504896454595957
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42105263157894735,
"mc1_stderr": 0.017283936248136487,
"mc2": 0.5885916212997572,
"mc2_stderr": 0.015413433354723878
},
"harness|winogrande|5": {
"acc": 0.7584846093133386,
"acc_stderr": 0.012028983782011874
},
"harness|gsm8k|5": {
"acc": 0.5041698256254739,
"acc_stderr": 0.01377200577479154
}
}
```
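Once loaded, the results are plain nested dictionaries keyed by `"harness|<task>|<n_shots>"`. A minimal sketch of navigating that structure, using a hand-built excerpt with values copied from the JSON above (not the full results):

```python
# Hand-built excerpt of the nested results structure shown above.
results = {
    "all": {"acc": 0.6208780416068335, "acc_norm": 0.6241469474445721},
    "harness|winogrande|5": {"acc": 0.7584846093133386},
    "harness|gsm8k|5": {"acc": 0.5041698256254739},
}

# A single task's accuracy is keyed by "harness|<task>|<n_shots>".
winogrande_acc = results["harness|winogrande|5"]["acc"]
print(f"winogrande (5-shot) acc: {winogrande_acc:.4f}")
```

The same indexing works on the full dict returned when loading the `results` configuration of this repository.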
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mask-distilled-one-sec-cv12/chunk_136 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1161938388
num_examples: 228189
download_size: 1184197216
dataset_size: 1161938388
---
# Dataset Card for "chunk_136"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/3d-school_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 7611567
num_examples: 10000
download_size: 824978
dataset_size: 7611567
---
# Dataset Card for "3d-school_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test-mathemakitt-c793f9-1654758678 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-3b
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test
dataset_config: mathemakitten--winobias_antistereotype_test
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-3b
* Dataset: mathemakitten/winobias_antistereotype_test
* Config: mathemakitten--winobias_antistereotype_test
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@opfaffel@gmail.com](https://huggingface.co/opfaffel@gmail.com) for evaluating this model. |
wenhu/TheoremQA | ---
language:
- en
license: mit
size_categories:
- n<1K
task_categories:
- question-answering
pretty_name: TheoremQA
tags:
- question answering
- math
- science
- visual question answering
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: Answer_type
dtype: string
- name: Picture
dtype: image
splits:
- name: train
num_bytes: 5025005.0
num_examples: 800
- name: test
num_bytes: 5025005.0
num_examples: 800
download_size: 9898950
dataset_size: 10050010.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
## Introduction
We propose the first question-answering dataset driven by STEM theorems. We annotated 800 QA pairs covering 350+ theorems spanning Math, EE&CS, Physics and Finance. The dataset was collected by human experts and is of very high quality. We provide the dataset as a new benchmark to test the limits of large language models in applying theorems to solve challenging university-level questions. We provide a pipeline below to prompt LLMs and evaluate their outputs with WolframAlpha.
## How to use TheoremQA
```
from datasets import load_dataset
dataset = load_dataset("wenhu/TheoremQA")
for d in dataset['test']:
print(d)
```
## Arxiv Paper:
https://arxiv.org/abs/2305.12524
## Code
https://github.com/wenhuchen/TheoremQA/tree/main |
LeoLM/MMLU_de | ---
license: mit
---
# Massive Multitask Language Understanding (MMLU) in German
This dataset is intended for evaluating the German language understanding of LLMs.
It is based on the hendrycksTest dataset ([here](https://huggingface.co/datasets/cais/mmlu) and [here](https://huggingface.co/datasets/tasksource/mmlu)) and was created
by using the GPT-3.5 API to translate the entire test set and a few examples of the validation set. To make sure the answer options follow the intended sentence structure
and are always in the correct format, GPT was prompted to output JSON. This came with some complications that were later fixed manually.
The prompt used to translate a single example was the following:
```
insert prompt here @TODO
```
This translation cost a total of ~13€ including iterating on the prompt and fixing broken examples.
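The JSON-output approach described above can be sketched as follows. Note that the field names (`question`, `choices`) are illustrative assumptions, not the exact schema used for this dataset:

```python
import json

# Hypothetical JSON returned by the translation model for one MMLU example.
# The field names here are illustrative, not the exact schema used.
raw_output = """
{
  "question": "Was ist die Hauptstadt von Frankreich?",
  "choices": ["Berlin", "Madrid", "Paris", "Rom"]
}
"""

def parse_translation(raw: str) -> dict:
    """Parse a JSON translation and validate that it has exactly four answer options."""
    example = json.loads(raw)
    if len(example["choices"]) != 4:
        raise ValueError("expected exactly four answer options")
    return example

example = parse_translation(raw_output)
print(example["question"])  # Was ist die Hauptstadt von Frankreich?
```

Enforcing a fixed number of options at parse time is one way the "broken examples" mentioned above can be caught automatically before manual fixing.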
|
zolak/twitter_dataset_80_1713099984 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3182129
num_examples: 7964
download_size: 1621192
dataset_size: 3182129
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tdh87/STYLEdBiGGEr | ---
license: apache-2.0
---
|
satwikapaul/braille_2 | ---
license: openrail
---
|
0x7o/ruTextNorm-data | ---
dataset_info:
features:
- name: full
dtype: string
- name: short
dtype: string
splits:
- name: train
num_bytes: 245430258
num_examples: 761435
download_size: 135847160
dataset_size: 245430258
---
# Dataset Card for "ruTextNorm-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
coref-data/mmc_indiscrim | ---
dataset_info:
- config_name: mmc_en
features:
- name: sentences
list:
- name: id
dtype: int64
- name: misc
struct:
- name: parse_tree
dtype: string
- name: speaker
dtype: string
- name: text
dtype: string
- name: tokens
list:
- name: deprel
dtype: string
- name: end_char
dtype: int64
- name: feats
dtype: string
- name: head
dtype: int64
- name: id
dtype: int64
- name: lemma
dtype: string
- name: misc
dtype: string
- name: start_char
dtype: int64
- name: text
dtype: string
- name: upos
dtype: string
- name: xpos
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: id
dtype: string
- name: text
dtype: string
- name: genre
dtype: string
- name: meta_data
struct:
- name: comment
dtype: string
splits:
- name: train
num_bytes: 32714450
num_examples: 955
- name: validation
num_bytes: 4684074
num_examples: 134
- name: test
num_bytes: 3576454
num_examples: 133
download_size: 8195117
dataset_size: 40974978
- config_name: mmc_fa
features:
- name: sentences
list:
- name: id
dtype: int64
- name: speaker
dtype: string
- name: text
dtype: string
- name: tokens
list:
- name: id
dtype: int64
- name: text
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: id
dtype: string
- name: text
dtype: string
- name: genre
dtype: string
- name: meta_data
struct:
- name: comment
dtype: string
splits:
- name: train
num_bytes: 8511917
num_examples: 950
- name: validation
num_bytes: 1308706
num_examples: 134
- name: test
num_bytes: 959400
num_examples: 133
download_size: 3083246
dataset_size: 10780023
- config_name: mmc_fa_corrected
features:
- name: sentences
list:
- name: id
dtype: int64
- name: speaker
dtype: string
- name: text
dtype: string
- name: tokens
list:
- name: id
dtype: int64
- name: text
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: id
dtype: string
- name: text
dtype: string
- name: genre
dtype: string
- name: meta_data
struct:
- name: comment
dtype: string
splits:
- name: train
num_bytes: 8511917
num_examples: 950
- name: validation
num_bytes: 1308706
num_examples: 134
- name: test
num_bytes: 988920
num_examples: 133
download_size: 3086246
dataset_size: 10809543
- config_name: mmc_zh_corrected
features:
- name: sentences
list:
- name: id
dtype: int64
- name: speaker
dtype: string
- name: text
dtype: string
- name: tokens
list:
- name: id
dtype: int64
- name: text
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: id
dtype: string
- name: text
dtype: string
- name: genre
dtype: string
- name: meta_data
struct:
- name: comment
dtype: string
splits:
- name: train
num_bytes: 8024979
num_examples: 948
- name: validation
num_bytes: 1217704
num_examples: 134
- name: test
num_bytes: 765302
num_examples: 133
download_size: 2653472
dataset_size: 10007985
- config_name: mmc_zh_uncorrected
features:
- name: sentences
list:
- name: id
dtype: int64
- name: speaker
dtype: string
- name: text
dtype: string
- name: tokens
list:
- name: id
dtype: int64
- name: text
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: id
dtype: string
- name: text
dtype: string
- name: genre
dtype: string
- name: meta_data
struct:
- name: comment
dtype: string
splits:
- name: train
num_bytes: 8024979
num_examples: 948
- name: validation
num_bytes: 1217704
num_examples: 134
- name: test
num_bytes: 926344
num_examples: 133
download_size: 2655536
dataset_size: 10169027
configs:
- config_name: mmc_en
data_files:
- split: train
path: mmc_en/train-*
- split: validation
path: mmc_en/validation-*
- split: test
path: mmc_en/test-*
- config_name: mmc_fa
data_files:
- split: train
path: mmc_fa/train-*
- split: validation
path: mmc_fa/validation-*
- split: test
path: mmc_fa/test-*
- config_name: mmc_fa_corrected
data_files:
- split: train
path: mmc_fa_corrected/train-*
- split: validation
path: mmc_fa_corrected/validation-*
- split: test
path: mmc_fa_corrected/test-*
- config_name: mmc_zh_corrected
data_files:
- split: train
path: mmc_zh_corrected/train-*
- split: validation
path: mmc_zh_corrected/validation-*
- split: test
path: mmc_zh_corrected/test-*
- config_name: mmc_zh_uncorrected
data_files:
- split: train
path: mmc_zh_uncorrected/train-*
- split: validation
path: mmc_zh_uncorrected/validation-*
- split: test
path: mmc_zh_uncorrected/test-*
---
This dataset was generated by reformatting [`coref-data/mmc_raw`](https://huggingface.co/datasets/coref-data/mmc_raw) into the indiscrim coreference format. See that repo for dataset details.
See [ianporada/coref-data](https://github.com/ianporada/coref-data) for additional conversion details and the conversion script.
Please create an issue in the repo above or in this dataset repo for any questions.
|
arubenruben/cnn_dailymail_google_translator | ---
dataset_info:
features:
- name: document
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 43257744
num_examples: 10000
- name: validation
num_bytes: 19194954
num_examples: 5000
- name: test
num_bytes: 45552717
num_examples: 10000
download_size: 64070699
dataset_size: 108005415
task_categories:
- summarization
- translation
language:
- pt
tags:
- Machine Translation
pretty_name: Portuguese CNN-Dailymail-Google
---
# Dataset Card for "cnn_dailymail_google_translator"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-muse256-muse512-wuerst-sdv15/707d50d0 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 150
num_examples: 10
download_size: 1285
dataset_size: 150
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "707d50d0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
osunlp/TableInstruct | ---
license: cc-by-4.0
language:
- en
size_categories:
- 1M<n<10M
---
---
# TableLlama: Towards Open Large Generalist Models for Tables
Project Page: [https://osu-nlp-group.github.io/TableLlama/](https://osu-nlp-group.github.io/TableLlama/)
Paper: [https://arxiv.org/abs/2311.09206](https://arxiv.org/abs/2311.09206)
Model: [https://huggingface.co/osunlp/TableLlama/](https://huggingface.co/osunlp/TableLlama/)
Code: [https://osu-nlp-group.github.io/TableLlama/](https://osu-nlp-group.github.io/TableLlama/)
## Introduction
We introduce TableLlama, an open-source large generalist model specifically tailored for various table-based tasks. The TableLlama model is trained on the TableInstruct dataset, a meticulously curated instruction-tuning dataset for tables. TableLlama is tuned on 2.6 million table-based task examples and can handle up to an 8K context!
## Model
🤗 [TableLlama-7B](https://huggingface.co/osunlp/TableLlama/)
## Data
The models are trained on the 🤗 [TableInstruct Dataset](https://huggingface.co/datasets/osunlp/TableInstruct), which includes a comprehensive table-based instruction tuning dataset that covers a variety of real-world tables and realistic tasks. We include 14 datasets of 11 tasks in total. Check out the dataset card for more details.
## Training Procedure
The models are fine-tuned on the TableInstruct dataset using the fully fine-tuned LongLoRA (7B) as the base model, which replaces the vanilla attention mechanism of the original Llama-2 (7B) with shift short attention. Training takes 9 days on a 48*A100 cluster. Check out our paper for more details.
## Evaluation
The models are evaluated on 8 in-domain datasets of 8 tasks and 6 out-of-domain datasets of 4 tasks.
## Usage
You can use the models through Huggingface's Transformers library.
Check our Github repo for more advanced use: [https://osu-nlp-group.github.io/TableLlama/](https://osu-nlp-group.github.io/TableLlama/)
## Prompt Format
```
Below is an instruction that describes a task, paired with an input that provides further context. Write a response that
appropriately completes the request.
### Instruction:
{instruction}
### Input:
{input}
### Question:
{question}
### Response:
```
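As a minimal sketch, the template above can be filled programmatically like this (the instruction, table, and question strings are illustrative, not taken from TableInstruct):

```python
# TableLlama prompt template, as documented in the "Prompt Format" section above.
PROMPT_TEMPLATE = (
    "Below is an instruction that describes a task, paired with an input that provides "
    "further context. Write a response that\nappropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Input:\n{input}\n\n"
    "### Question:\n{question}\n\n"
    "### Response:\n"
)

def build_prompt(instruction: str, table: str, question: str) -> str:
    """Fill the template with one table-based example."""
    return PROMPT_TEMPLATE.format(instruction=instruction, input=table, question=question)

prompt = build_prompt(
    instruction="Answer the question based on the table.",
    table="| city | population |\n| Paris | 2.1M |",
    question="Which city is listed?",
)
print(prompt)
```

The resulting string can then be passed to the model (e.g. through Transformers' `generate`) with the response decoded from whatever follows `### Response:`.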
## Citation
If you use the models, data, or code from this project, please cite the original paper:
```
@misc{zhang2023tablellama,
title={TableLlama: Towards Open Large Generalist Models for Tables},
author={Tianshu Zhang and Xiang Yue and Yifei Li and Huan Sun},
year={2023},
eprint={2311.09206},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
breadlicker45/rlhf-prompt | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 48283331
num_examples: 36768
download_size: 3956825
dataset_size: 48283331
---
# Dataset Card for "rlhf-prompt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Coriolan/smart-contract-vulnerabilities | ---
license: mit
---
|
deepdoctection/FRFPE | ---
license: odc-by
task_categories:
- token-classification
language:
- de
- en
- fr
tags:
- finance
pretty_name: 'Funds report token classification '
size_categories:
- n<1K
---
**F**unds **R**eport **F**ront **P**age **E**ntities (FRFPE) is a dataset for document understanding and token classification.
It contains 356 titles/front pages of annual and semi-annual reports as well as extracted text and annotations for five different token categories.
FRFPE serves as an example of how to train and evaluate multimodal models such as LayoutLM using the deepdoctection framework on a custom dataset.
FRFPE contains documents in three different languages
- english: 167
- german: 149
- french: 9
as well as the token categories:
- report_date (1096 samples) - reporting date of the report
- report_type (738 samples) - annual/semi-annual report
- umbrella (912 samples) - fund issued as umbrella
- fund_name (2122 samples) - Subfund, as part of an umbrella fund or standalone fund
- other (12903 samples) - None of the above categories
The annotations have been made to the best of our knowledge and belief, but there is no claim of correctness.
Some cursory notes:
- The images were created by converting PDF files. A resolution of 300 dpi was applied during the conversion.
- The text was extracted from the PDF file using PDFPlumber. In some cases the PDF contains embedded images, which in turn contain text, such as corporate names. These are not extracted and are therefore not taken into account.
- The annotation was carried out with the annotation tool Prodigy.
- The category `report_date` is self-explanatory. `report_type` indicates whether the report is an annual report, a semi-annual report, or a report in a different cycle.
- `umbrella`/`fund_name` is the classification of any token that is part of a fund name representing either an umbrella, subfund or standalone fund.
The distinction of whether a fund represents an umbrella or a single fund is not always apparent from the context of the document, which makes the classification
particularly challenging. To keep the annotation correct, information from the BaFin database was used for cases that could not be clarified from the context.
To explore the dataset, we suggest using **deep**doctection. Place the unzipped folder in the **deep**doctection `~/.cache/datasets` folder.
```python
import deepdoctection as dd
from pathlib import Path
@dd.object_types_registry.register("ner_first_page")
class FundsFirstPage(dd.ObjectTypes):
report_date = "report_date"
umbrella = "umbrella"
report_type = "report_type"
fund_name = "fund_name"
dd.update_all_types_dict()
path = Path("~/.cache/datasets/fund_ar_front_page/40952248ba13ae8bfdd39f56af22f7d9_0.json")
page = dd.Page.from_file(path)
page.image = dd.load_image_from_file(path.parents[0] / "image" / page.file_name.replace("pdf","png"))
page.viz(interactive=True,show_words=True) # close interactive window with q
for word in page.words:
print(f"text: {word.characters}, token class: {word.token_class}")
``` |
fivitee/oiyhn | ---
license: creativeml-openrail-m
---
|
linkaja/QnA-indo-election-2024 | ---
dataset_info:
features:
- name: content
dtype: string
- name: chat
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 82999290
num_examples: 13758
download_size: 31683268
dataset_size: 82999290
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MattZid/hate_speech | ---
configs:
- config_name: default
data_files:
- split: train
path: "data/train/*.jsonl"
- split: validation
path: "data/val/*.jsonl"
- split: test
path: "data/test/*.jsonl"
--- |
SaeedMLK/seq2seq_ccmatrix_ar_en | ---
task_categories:
- translation
language:
- ar
- en
--- |
qgallouedec/prj_gia_dataset_metaworld_window_open_v2_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning dataset for the window-open-v2 environment, sampled from the window-open-v2 policy.
This environment was created as part of the Generally Intelligent Agents project gia: https://github.com/huggingface/gia
## Load dataset
First, clone it with
```sh
git clone https://huggingface.co/datasets/qgallouedec/prj_gia_dataset_metaworld_window_open_v2_1111
```
Then, load it with
```python
import numpy as np
dataset = np.load("prj_gia_dataset_metaworld_window_open_v2_1111/dataset.npy", allow_pickle=True).item()
print(dataset.keys()) # dict_keys(['observations', 'actions', 'dones', 'rewards'])
```
|
Falah/chapter6_0_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 3832
num_examples: 15
download_size: 4044
dataset_size: 3832
---
# Dataset Card for "chapter6_0_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ronaldahmed/scitechnews | ---
task_categories:
- summarization
- text2text-generation
language:
- en
tags:
- science journalism
- style transfer
- text simplification
pretty_name: scitechnews
size_categories:
- 1K<n<10K
---
# Dataset Card for `scitechnews`
## Dataset Description
- **Repository:** [https://github.com/ronaldahmed/scitechnews](https://github.com/ronaldahmed/scitechnews)
- **Paper:** [‘Don’t Get Too Technical with Me’: A Discourse Structure-Based Framework for Science Journalism](https://arxiv.org/abs/2310.15077)
- **Point of Contact:** [Ronald Cardenas](mailto:ronald.cardenas@ed.ac.uk)
### Dataset Summary
The SciTechNews dataset consists of scientific papers paired with their corresponding
press release snippet mined from [ACM TechNews](https://technews.acm.org/).
ACM TechNews is a news aggregator that provides regular news digests about scientific achievements and technology in the areas of Computer Science, Engineering, Astrophysics, Biology, and others.
### Supported Tasks and Leaderboards
This dataset was curated for the task of Science Journalism, a text-to-text task where the input is a scientific article and the output is a press release summary.
However, this release also includes additional information from the press release and the scientific article, such as the
press release article body, title, and authors' names and affiliations.
The science journalism leaderboard is found [here]().
### Languages
English
## Dataset Structure
### Data Fields
```
{
"id": String, # unique ID
"pr-title": String, # Title as found in the ACMTECHNEWS website
"pr-article": String, # Press release article
"pr-summary": String, # Press release summary
"sc-title": String, # Title of scientific article
"sc-abstract": String, # Abstract of scientific article
"sc-article": String, # Concatenated abstract and sections of the scientific article
"sc-sections": List[String], # List of sections in the scientific article
"sc-section_names": List[String] # List of section names
"sc-authors": List[String] # list of authors' name and affiliations, in the format '<name> | <affil>'
}
```
Paragraphs in the press release articles (`pr-article`) and sections of the scientific article (`sc-sections`)
are separated by `\n`. Data is not sentence or word tokenized.<br>
Note that field `sc-article` includes the article's abstract as well as its sections.
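The `sc-authors` format above (`'<name> | <affil>'`) can be split back into name and affiliation; a minimal sketch:

```python
def parse_author(entry: str) -> tuple[str, str]:
    """Split an 'sc-authors' entry of the form '<name> | <affil>' into its two parts."""
    name, _, affil = entry.partition(" | ")
    return name.strip(), affil.strip()

# Entry taken from the example instance below.
entry = "Reza Nadri | Cheriton School of Computer Science, University of Waterloo"
name, affiliation = parse_author(entry)
print(name)         # Reza Nadri
print(affiliation)  # Cheriton School of Computer Science, University of Waterloo
```

`str.partition` splits only on the first `" | "`, so affiliations containing commas remain intact.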
### Example Instance
```
{
"id": 37,
"pr-title": "What's in a Developer's Name?",
"pr-article": "In one of the most memorable speeches from William Shakespeare's play, Romeo and Juliet , Juliet ponders, \" What's in a name? That which...",
"pr-summary": ""Researchers at the University of Waterloo's Cheriton School of Computer Science in Canada found a software developer's perceived race and ethnicity,...",
"sc-title": On the Relationship Between the Developer's Perceptible Race and Ethnicity and the Evaluation of Contributions in OSS",
"sc-abstract": "Context: Open Source Software (OSS) projects are typically the result of collective efforts performed by developers with different backgrounds...",
"sc-articles": "Context: Open Source Software (OSS) projects are typically the result of .... In any line of work, diversity regarding race, gender, personality...",
"sc-sections": ["In any line of work, diversity regarding race, gender, personality...","To what extent is the submitter's perceptible race and ethnicity related to...",...],
"sc-section_names": ["INTRODUCTION", "RQ1:", "RQ2:", "RELATED WORK",...],
"sc-authors": ["Reza Nadri | Cheriton School of Computer Science, University of Waterloo", "Gema Rodriguez Perez | Cheriton School of ...",...]
}
```
### Data Splits
The number of instances in train/valid/test is 26,368 / 1,431 / 1,000.<br>
Note that the training set has only press release data (`pr-*`); the
validation and test splits have all fields.
## Dataset Creation
### Curation Rationale
*Science journalism* refers to producing journalistic content that covers topics related to different areas of scientific research. It plays an important role in fostering public understanding of science and its impact.
However, the sheer volume of scientific literature makes it challenging for journalists to report on every significant discovery, potentially leaving many overlooked.<br>
We construct a new open-access high-quality dataset for automatic science journalism that covers a wide range of scientific disciplines.
### Source Data
Press release snippets are mined from ACM TechNews, and their respective scientific articles are mined from
reputable open-access journals and conference proceedings.
#### Initial Data Collection and Normalization
We collect archived TechNews snippets between 1999 and 2021 and link them with their respective press release articles.
Then, we parse each news article for links to the scientific article it reports about.
We discard samples where we find more than one link to scientific articles in the press release.
Finally, the scientific articles are retrieved in PDF format and processed using [Grobid](https://github.com/kermitt2/grobid).
Following collection strategies of previous scientific summarization datasets, section heading names are retrieved, and the article text is divided into sections. We also extract the title and all author names and affiliations.
#### Who are the source language producers?
All texts in this dataset (titles, summaries, and article bodies) were produced by humans.
## Considerations for Using the Data
### Social Impact of Dataset
The task of automatic science journalism is intended to support journalists or the researchers themselves in writing high-quality journalistic content more efficiently and coping with information overload.
For instance, a journalist could use the summaries generated by our systems as an initial draft and edit it for factual inconsistencies and add context if needed.
Although we do not foresee negative societal impact from the task or the accompanying data itself, we point to the
general challenges related to factuality and bias in machine-generated texts, and call on potential users and developers of science journalism
applications to exert caution and follow up-to-date ethical policies.
## Additional Information
### Dataset Curators
- Ronald Cardenas, University of Edinburgh
- Bingsheng Yao, Rensselaer Polytechnic Institute
- Dakuo Wang, Northeastern University
- Yufang Hou, IBM Research Ireland
### Citation Information
```
@article{cardenas2023dont,
title={'Don't Get Too Technical with Me': A Discourse Structure-Based Framework for Science Journalism},
author={Ronald Cardenas and Bingsheng Yao and Dakuo Wang and Yufang Hou},
year={2023},
eprint={2310.15077},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
Zombely/wikisource-green | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train_1
num_bytes: 15342818708.456
num_examples: 9816
- name: train_2
num_bytes: 13234327199.457
num_examples: 9997
- name: train_3
num_bytes: 8814747830.88
num_examples: 9935
- name: train_4
num_bytes: 10839226390.145
num_examples: 9995
- name: train_5
num_bytes: 12414635965.0
num_examples: 10000
- name: train_6
num_bytes: 5911580759.0
num_examples: 10000
- name: train_7
num_bytes: 11420080854.0
num_examples: 10000
- name: train_8
num_bytes: 18080629271.0
num_examples: 10000
- name: train_9
num_bytes: 11348011360.0
num_examples: 10000
- name: train_10
num_bytes: 14141957301.0
num_examples: 10000
- name: train_11
num_bytes: 9983910604.0
num_examples: 10000
- name: train_12
num_bytes: 13105253749.0
num_examples: 10000
- name: train_13
num_bytes: 15681320595.0
num_examples: 10000
- name: train_14
num_bytes: 14896725472.0
num_examples: 10000
- name: train_15
num_bytes: 11493364396.927
num_examples: 9987
- name: validation
num_bytes: 4487934740.612
num_examples: 4077
download_size: 5330245163
dataset_size: 191196525196.477
---
# Dataset Card for "wikisource-green"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
g4m3r/LS23 | ---
license: mit
---
|
fireworks-ai/msmarco_rank | ---
dataset_info:
features:
- name: query
dtype: string
- name: positive
sequence: string
- name: negative
sequence: string
splits:
- name: train
num_bytes: 13976268494
num_examples: 398792
download_size: 7376188746
dataset_size: 13976268494
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "msmarco_rank"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_57 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1341548504.0
num_examples: 263462
download_size: 1365956903
dataset_size: 1341548504.0
---
# Dataset Card for "chunk_57"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-high_school_statistics-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 10793
num_examples: 5
- name: test
num_bytes: 2682018
num_examples: 216
download_size: 270672
dataset_size: 2692811
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-high_school_statistics-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alturing/gutenberg-texts | ---
dataset_info:
features:
- name: title
dtype: string
- name: author
dtype: string
- name: text
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 959018479
num_examples: 2951
download_size: 562052485
dataset_size: 959018479
---
# Dataset Card for "gutenberg-texts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Zhouf23/slp-2002-2022-4degree | ---
license: mit
tags:
- climate
pretty_name: slp_toy
size_categories:
- 1K<n<10K
---
This dataset is a low-resolution sea-level-pressure field for testing the global weather model.
File information:
- Variable: sea-level-pressure (slp)
- Spatial resolution: 4 degree lat and lon
- Spatial dimension: 46(lat) * 90(lon)
- Temporal resolution: 1 day
- Number of snapshots: 7300 (approx 20 years)
- Size: 230 MB |
nryn21/int | ---
license: mit
---
|
QLM78910/funsd-zh | ---
dataset_info:
features:
- name: lang
dtype: string
- name: version
dtype: string
- name: split
dtype: string
- name: documents
list:
- name: id
dtype: string
- name: uid
dtype: string
- name: document
list:
- name: box
sequence: int64
- name: text
dtype: string
- name: label
dtype: string
- name: words
list:
- name: box
sequence: int64
- name: text
dtype: string
- name: linking
sequence:
sequence: int64
- name: id
dtype: int64
- name: img
struct:
- name: fname
dtype: string
- name: width
dtype: int64
- name: height
dtype: int64
splits:
- name: train
num_bytes: 4057416
num_examples: 1
- name: val
num_bytes: 1483956
num_examples: 1
download_size: 1269925
dataset_size: 5541372
---
# Dataset Card for "funsd-zh"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
akshayylr/skull_xray | ---
license: openrail
---
|
jk-gjom/covid19weibo | ---
license: afl-3.0
---
|
mask-distilled-libri-one-sec-cv12/chunk_0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: logits
sequence: float32
splits:
- name: train
num_bytes: 372046145.6693777
num_examples: 11605
download_size: 290578113
dataset_size: 372046145.6693777
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tasksource/dynahate | ---
license: gpl
---
|
asi/wikitext_fr | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- fr
language_bcp47:
- fr-FR
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
pretty_name: Wikitext-fr
size_categories:
- unknown
source_datasets:
- original
task_categories:
- sequence-modeling
task_ids:
- language-modeling
---
# Dataset Card Creation Guide
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [https://github.com/AntoineSimoulin/gpt-fr](https://github.com/AntoineSimoulin/gpt-fr)
- **Paper:** [https://aclanthology.org/2021.jeptalnrecital-taln.24.pdf](https://aclanthology.org/2021.jeptalnrecital-taln.24.pdf)
### Dataset Summary
Wikitext-fr is a language modeling dataset consisting of over 70 million tokens extracted from the set of French Wikipedia articles classified as "quality articles" or "good articles". It is designed to mirror the English benchmark introduced by Stephen Merity, Caiming Xiong, James Bradbury, and Richard Socher (2016) in [Pointer Sentinel Mixture Models](https://arxiv.org/abs/1609.07843). The dataset is available under the [Creative Commons Attribution-ShareAlike License](https://creativecommons.org/licenses/by-sa/4.0/).
### Supported Tasks and Leaderboards
- `language-modeling`: The dataset can be used to evaluate the generation abilities of a model. Success on this task is typically measured by achieving a *low* perplexity. The [GPT-fr model](https://huggingface.co/asi/gpt-fr-cased-base) currently achieves a perplexity of 12.9.
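Perplexity, the metric mentioned above, is the exponential of the mean negative log-likelihood a model assigns to each token. A minimal sketch (the per-token probabilities below are made up for illustration):

```python
import math

def perplexity(token_log_probs):
    """Exponential of the mean negative log-likelihood over the tokens."""
    return math.exp(-sum(token_log_probs) / len(token_log_probs))

# Hypothetical per-token probabilities assigned by a language model.
probs = [0.25, 0.5, 0.125]
print(perplexity([math.log(p) for p in probs]))
```

A lower value means the model is, on average, less "surprised" by each token; a perplexity of 12.9 corresponds to an effective branching factor of about 13 tokens per position.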
### Languages
The dataset is in French.
## Dataset Structure
### Data Instances
The dataset consists of an aggregation of paragraphs from Wikipedia articles.
```
{
'paragraph': ...,
...
}
```
### Data Fields
- `paragraph`: This is a paragraph from the original wikipedia article.
### Data Splits
The dataset is split into train/validation/test sets. Two training configurations are provided, with approximately 35 million and 72 million tokens respectively.
| | Train (35) | Train (72) | Valid | Test |
| ----- | ------ | ----- | ---- | ---- |
| Number of Documents | 2 126 | 5 902 | 60 | 60 |
| Number of tokens (thousands) | 35 166 | 72 961 | 896 | 897 |
| Vocabulary size | 137 589 | 205 403 | | |
| Out of Vocabulary | 0.8% | 1.2% | | |
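Vocabulary-size and out-of-vocabulary statistics like those in the table can be computed with a simple set-based count. A minimal sketch on toy token lists (whitespace tokenization is an assumption here; the statistics above may use a different tokenizer):

```python
def vocab_and_oov_rate(train_tokens, eval_tokens):
    """Return the training vocabulary size and the fraction of
    evaluation tokens absent from that vocabulary."""
    vocab = set(train_tokens)
    oov = sum(1 for tok in eval_tokens if tok not in vocab)
    return len(vocab), oov / len(eval_tokens)

train = "le chat dort le chien court".split()
valid = "le chat mange".split()
print(vocab_and_oov_rate(train, valid))  # "mange" is out of vocabulary
```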
## Dataset Creation
### Curation Rationale
The dataset was created to evaluate French models with criteria similar to those used for English.
### Source Data
Wikitext-fr consists of over 70 million tokens extracted from the set of French Wikipedia articles classified as "quality articles" or "good articles".
We did not apply specific pre-processing, as Transformer models typically rely on their own dedicated tokenization.
#### Initial Data Collection and Normalization
We used the Wikipedia API to collect the articles since cleaning Wikipedia articles from dumps is not a trivial task.
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
The dataset is available under the [Creative Commons Attribution-ShareAlike License](https://creativecommons.org/licenses/by-sa/4.0/)
### Citation Information
```
@inproceedings{simoulin:hal-03265900,
TITLE = {{Un mod{\`e}le Transformer G{\'e}n{\'e}ratif Pr{\'e}-entrain{\'e} pour le \_\_\_\_\_\_ fran{\c c}ais}},
AUTHOR = {Simoulin, Antoine and Crabb{\'e}, Benoit},
URL = {https://hal.archives-ouvertes.fr/hal-03265900},
BOOKTITLE = {{Traitement Automatique des Langues Naturelles}},
ADDRESS = {Lille, France},
EDITOR = {Denis, Pascal and Grabar, Natalia and Fraisse, Amel and Cardon, R{\'e}mi and Jacquemin, Bernard and Kergosien, Eric and Balvet, Antonio},
PUBLISHER = {{ATALA}},
PAGES = {246-255},
YEAR = {2021},
KEYWORDS = {fran{\c c}ais. ; GPT ; G{\'e}n{\'e}ratif ; Transformer ; Pr{\'e}-entra{\^i}n{\'e}},
PDF = {https://hal.archives-ouvertes.fr/hal-03265900/file/7.pdf},
HAL_ID = {hal-03265900},
HAL_VERSION = {v1},
}
```
### Contributions
Thanks to [@AntoineSimoulin](https://github.com/AntoineSimoulin) for adding this dataset. |
codesignal/wine-quality | ---
license: cc-by-4.0
language:
- en
pretty_name: Wine Quality
size_categories:
- 1K<n<10K
--- |