| datasetId | card |
|---|---|
Joeman-Chen/text | ---
license: unknown
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-7000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 646489
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ravener/data | ---
license: mit
---
|
duongttr/SachGiaoKhoaOnline-raw | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: web-scraper-order
dtype: string
- name: web-scraper-start-url
dtype: string
- name: book_name
dtype: string
- name: book_name-href
dtype: string
- name: unit_name
dtype: string
- name: unit_name-href
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 32801343
num_examples: 2587
download_size: 8829746
dataset_size: 32801343
---
# Dataset Card for "SachGiaoKhoaOnline-raw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BirdL/DallData | ---
annotations_creators: []
language: []
language_creators: []
license:
- other
multilinguality: []
pretty_name: DALL-E Latent Space Mapping
size_categories:
- 1K<n<10K
source_datasets: []
tags: []
task_categories:
- unconditional-image-generation
task_ids: []
---
DallData is a non-exhaustive look into DALL-E Mega (1)'s unconditional image generation. It is released under the [BirdL-AirL License](https://huggingface.co/spaces/BirdL/license/).
(1)
```bibtex
@misc{Dayma_DALL·E_Mini_2021,
author = {Dayma, Boris and Patil, Suraj and Cuenca, Pedro and Saifullah, Khalid and Abraham, Tanishq and Lê Khắc, Phúc and Melas, Luke and Ghosh, Ritobrata},
doi = {10.5281/zenodo.5146400},
month = {7},
title = {DALL·E Mini},
url = {https://github.com/borisdayma/dalle-mini},
year = {2021}
}
``` |
biglam/europeana_newspapers | ---
annotations_creators:
- no-annotation
language:
- de
- fr
- el
- et
- fi
- hr
- ji
- pl
- ru
- sr
- sv
- uk
language_creators:
- machine-generated
multilinguality:
- multilingual
pretty_name: 'Europeana Newspapers'
size_categories:
- 1M<n<10M
source_datasets: []
tags:
- newspapers
- lam
- OCR
task_categories:
- text-generation
task_ids:
- language-modeling
---
# Dataset Card for Dataset Name
This dataset contains historic newspapers from [Europeana](https://pro.europeana.eu/page/iiif#download). In total, the collection contains ~32 billion tokens. Documentation for this dataset is a work in progress.
## Dataset Details
### Dataset Description
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
To download the full dataset using the `datasets` library, you can do the following:
```python
from datasets import load_dataset
dataset = load_dataset("biglam/europeana_newspapers")
```
You can also access a subset based on language or decade range using the following function:
```python
from typing import List, Optional, Literal, Union
from huggingface_hub import hf_hub_url, list_repo_files
LanguageOption = Literal[
"et",
"pl",
"sr",
"ru",
"sv",
"no_language_found",
"ji",
"hr",
"el",
"uk",
"fr",
"fi",
"de",
"multi_language",
]
def get_files_for_lang_and_years(
languages: Union[None, List[LanguageOption]] = None,
min_year: Optional[int] = None,
max_year: Optional[int] = None,
):
files = list_repo_files("biglam/europeana_newspapers", repo_type="dataset")
parquet_files = [f for f in files if f.endswith(".parquet")]
    # Keep only files for the requested languages (all languages if None)
    if languages:
        parquet_files = [
            f for f in parquet_files if any(lang in f for lang in languages)
        ]
    # File names encode the decade, e.g. "fr-1910.parquet"
    filtered_files = [
        f
        for f in parquet_files
        if (min_year is None or min_year <= int(f.split("-")[1].split(".")[0]))
        and (max_year is None or int(f.split("-")[1].split(".")[0]) <= max_year)
    ]
return [
hf_hub_url("biglam/europeana_newspapers", f, repo_type="dataset")
for f in filtered_files
]
```
This function takes a list of language codes and minimum/maximum values for the decades you want to include. You can use this function to get the URLs for the files you want to download from the Hub:
```python
ds = load_dataset("parquet", data_files=get_files_for_lang_and_years(['fr']), num_proc=4)
```
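The decade filter in `get_files_for_lang_and_years` relies on the decade being encoded in the parquet file names. A minimal, self-contained sketch of that parsing logic, using hypothetical file names (check the repository file listing for the real naming scheme):

```python
# Hypothetical file names of the form "<lang>-<decade>.parquet".
parquet_files = ["fr-1870.parquet", "fr-1910.parquet", "uk-1930.parquet"]

def decade(filename: str) -> int:
    # "fr-1910.parquet" -> 1910
    return int(filename.split("-")[1].split(".")[0])

# Keep only files whose decade falls in [1900, 1920].
selected = [f for f in parquet_files if 1900 <= decade(f) <= 1920]
print(selected)  # ['fr-1910.parquet']
```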
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
suolyer/pile_openwebtext2 | ---
license: apache-2.0
---
|
autoevaluate/autoeval-staging-eval-project-0a15404e-7594901 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xtreme
eval_info:
task: entity_extraction
model: moghis/xlm-roberta-base-finetuned-panx-it
metrics: []
dataset_name: xtreme
dataset_config: PAN-X.it
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: moghis/xlm-roberta-base-finetuned-panx-it
* Dataset: xtreme
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
mystic-leung/medical_cord19 | ---
license: openrail
task_categories:
- summarization
language:
- aa
tags:
- medical
---
## Description
This dataset contains a large collection of biomedical abstracts and their corresponding summaries. |
ThanHitt/MasuSalmonID | ---
license: unknown
---
|
DNNmodelmaker/College-data | ---
language:
- en
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1137867
num_examples: 1772
download_size: 285359
dataset_size: 1137867
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lukaemon/mmlu | ---
dataset_info:
- config_name: abstract_algebra
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 18616
num_examples: 100
- name: validation
num_bytes: 1935
num_examples: 11
- name: train
num_bytes: 783
num_examples: 5
download_size: 166184960
dataset_size: 21334
- config_name: anatomy
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 32164
num_examples: 135
- name: validation
num_bytes: 3030
num_examples: 14
- name: train
num_bytes: 920
num_examples: 5
download_size: 166184960
dataset_size: 36114
- config_name: astronomy
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 45695
num_examples: 152
- name: validation
num_bytes: 4903
num_examples: 16
- name: train
num_bytes: 2029
num_examples: 5
download_size: 166184960
dataset_size: 52627
- config_name: business_ethics
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 32540
num_examples: 100
- name: validation
num_bytes: 2949
num_examples: 11
- name: train
num_bytes: 2143
num_examples: 5
download_size: 166184960
dataset_size: 37632
- config_name: clinical_knowledge
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 60887
num_examples: 265
- name: validation
num_bytes: 6449
num_examples: 29
- name: train
num_bytes: 1163
num_examples: 5
download_size: 166184960
dataset_size: 68499
- config_name: college_biology
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 47777
num_examples: 144
- name: validation
num_bytes: 4695
num_examples: 16
- name: train
num_bytes: 1485
num_examples: 5
download_size: 166184960
dataset_size: 53957
- config_name: college_chemistry
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 23996
num_examples: 100
- name: validation
num_bytes: 2260
num_examples: 8
- name: train
num_bytes: 1284
num_examples: 5
download_size: 166184960
dataset_size: 27540
- config_name: college_computer_science
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 41927
num_examples: 100
- name: validation
num_bytes: 4574
num_examples: 11
- name: train
num_bytes: 2718
num_examples: 5
download_size: 166184960
dataset_size: 49219
- config_name: college_mathematics
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 23996
num_examples: 100
- name: validation
num_bytes: 2579
num_examples: 11
- name: train
num_bytes: 1446
num_examples: 5
download_size: 166184960
dataset_size: 28021
- config_name: college_medicine
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 81174
num_examples: 173
- name: validation
num_bytes: 7743
num_examples: 22
- name: train
num_bytes: 1623
num_examples: 5
download_size: 166184960
dataset_size: 90540
- config_name: college_physics
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 29454
num_examples: 102
- name: validation
num_bytes: 3401
num_examples: 11
- name: train
num_bytes: 1365
num_examples: 5
download_size: 166184960
dataset_size: 34220
- config_name: computer_security
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 26412
num_examples: 100
- name: validation
num_bytes: 4460
num_examples: 11
- name: train
num_bytes: 1054
num_examples: 5
download_size: 166184960
dataset_size: 31926
- config_name: conceptual_physics
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 39052
num_examples: 235
- name: validation
num_bytes: 4279
num_examples: 26
- name: train
num_bytes: 887
num_examples: 5
download_size: 166184960
dataset_size: 44218
- config_name: econometrics
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 45737
num_examples: 114
- name: validation
num_bytes: 4871
num_examples: 12
- name: train
num_bytes: 1597
num_examples: 5
download_size: 166184960
dataset_size: 52205
- config_name: electrical_engineering
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 24111
num_examples: 145
- name: validation
num_bytes: 2778
num_examples: 16
- name: train
num_bytes: 925
num_examples: 5
download_size: 166184960
dataset_size: 27814
- config_name: elementary_mathematics
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 67450
num_examples: 378
- name: validation
num_bytes: 8689
num_examples: 41
- name: train
num_bytes: 1393
num_examples: 5
download_size: 166184960
dataset_size: 77532
- config_name: formal_logic
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 48891
num_examples: 126
- name: validation
num_bytes: 6142
num_examples: 14
- name: train
num_bytes: 1710
num_examples: 5
download_size: 166184960
dataset_size: 56743
- config_name: global_facts
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 17691
num_examples: 100
- name: validation
num_bytes: 1783
num_examples: 10
- name: train
num_bytes: 1182
num_examples: 5
download_size: 166184960
dataset_size: 20656
- config_name: high_school_biology
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 107550
num_examples: 310
- name: validation
num_bytes: 10786
num_examples: 32
- name: train
num_bytes: 1626
num_examples: 5
download_size: 166184960
dataset_size: 119962
- config_name: high_school_chemistry
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 57031
num_examples: 203
- name: validation
num_bytes: 6926
num_examples: 22
- name: train
num_bytes: 1173
num_examples: 5
download_size: 166184960
dataset_size: 65130
- config_name: high_school_computer_science
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 43764
num_examples: 100
- name: validation
num_bytes: 3268
num_examples: 9
- name: train
num_bytes: 2871
num_examples: 5
download_size: 166184960
dataset_size: 49903
- config_name: high_school_european_history
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 269133
num_examples: 165
- name: validation
num_bytes: 29494
num_examples: 18
- name: train
num_bytes: 11517
num_examples: 5
download_size: 166184960
dataset_size: 310144
- config_name: high_school_geography
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 40636
num_examples: 198
- name: validation
num_bytes: 4166
num_examples: 22
- name: train
num_bytes: 1356
num_examples: 5
download_size: 166184960
dataset_size: 46158
- config_name: high_school_government_and_politics
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 64711
num_examples: 193
- name: validation
num_bytes: 6904
num_examples: 21
- name: train
num_bytes: 1732
num_examples: 5
download_size: 166184960
dataset_size: 73347
- config_name: high_school_macroeconomics
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 114945
num_examples: 390
- name: validation
num_bytes: 12707
num_examples: 43
- name: train
num_bytes: 1281
num_examples: 5
download_size: 166184960
dataset_size: 128933
- config_name: high_school_mathematics
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 52952
num_examples: 270
- name: validation
num_bytes: 5550
num_examples: 29
- name: train
num_bytes: 1250
num_examples: 5
download_size: 166184960
dataset_size: 59752
- config_name: high_school_microeconomics
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 74025
num_examples: 238
- name: validation
num_bytes: 7359
num_examples: 26
- name: train
num_bytes: 1251
num_examples: 5
download_size: 166184960
dataset_size: 82635
- config_name: high_school_physics
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 58469
num_examples: 151
- name: validation
num_bytes: 6640
num_examples: 17
- name: train
num_bytes: 1442
num_examples: 5
download_size: 166184960
dataset_size: 66551
- config_name: high_school_psychology
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 155580
num_examples: 545
- name: validation
num_bytes: 16837
num_examples: 60
- name: train
num_bytes: 1858
num_examples: 5
download_size: 166184960
dataset_size: 174275
- config_name: high_school_statistics
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 109178
num_examples: 216
- name: validation
num_bytes: 9824
num_examples: 23
- name: train
num_bytes: 2481
num_examples: 5
download_size: 166184960
dataset_size: 121483
- config_name: high_school_us_history
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 295294
num_examples: 204
- name: validation
num_bytes: 31540
num_examples: 22
- name: train
num_bytes: 8817
num_examples: 5
download_size: 166184960
dataset_size: 335651
- config_name: high_school_world_history
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 376946
num_examples: 237
- name: validation
num_bytes: 45307
num_examples: 26
- name: train
num_bytes: 4835
num_examples: 5
download_size: 166184960
dataset_size: 427088
- config_name: human_aging
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 44525
num_examples: 223
- name: validation
num_bytes: 4534
num_examples: 23
- name: train
num_bytes: 961
num_examples: 5
download_size: 166184960
dataset_size: 50020
- config_name: human_sexuality
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 31181
num_examples: 131
- name: validation
num_bytes: 2325
num_examples: 12
- name: train
num_bytes: 1030
num_examples: 5
download_size: 166184960
dataset_size: 34536
- config_name: international_law
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 52672
num_examples: 121
- name: validation
num_bytes: 6370
num_examples: 13
- name: train
num_bytes: 2371
num_examples: 5
download_size: 166184960
dataset_size: 61413
- config_name: jurisprudence
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 33218
num_examples: 108
- name: validation
num_bytes: 3640
num_examples: 11
- name: train
num_bytes: 1256
num_examples: 5
download_size: 166184960
dataset_size: 38114
- config_name: logical_fallacies
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 48964
num_examples: 163
- name: validation
num_bytes: 4965
num_examples: 18
- name: train
num_bytes: 1526
num_examples: 5
download_size: 166184960
dataset_size: 55455
- config_name: machine_learning
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 33084
num_examples: 112
- name: validation
num_bytes: 3143
num_examples: 11
- name: train
num_bytes: 2276
num_examples: 5
download_size: 166184960
dataset_size: 38503
- config_name: management
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 19269
num_examples: 103
- name: validation
num_bytes: 1731
num_examples: 11
- name: train
num_bytes: 851
num_examples: 5
download_size: 166184960
dataset_size: 21851
- config_name: marketing
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 61375
num_examples: 234
- name: validation
num_bytes: 7207
num_examples: 25
- name: train
num_bytes: 1434
num_examples: 5
download_size: 166184960
dataset_size: 70016
- config_name: medical_genetics
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 20152
num_examples: 100
- name: validation
num_bytes: 2916
num_examples: 11
- name: train
num_bytes: 1042
num_examples: 5
download_size: 166184960
dataset_size: 24110
- config_name: miscellaneous
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 142211
num_examples: 783
- name: validation
num_bytes: 13716
num_examples: 86
- name: train
num_bytes: 652
num_examples: 5
download_size: 166184960
dataset_size: 156579
- config_name: moral_disputes
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 105384
num_examples: 346
- name: validation
num_bytes: 12142
num_examples: 38
- name: train
num_bytes: 1708
num_examples: 5
download_size: 166184960
dataset_size: 119234
- config_name: moral_scenarios
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 367749
num_examples: 895
- name: validation
num_bytes: 41626
num_examples: 100
- name: train
num_bytes: 2011
num_examples: 5
download_size: 166184960
dataset_size: 411386
- config_name: nutrition
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 90256
num_examples: 306
- name: validation
num_bytes: 8193
num_examples: 33
- name: train
num_bytes: 2038
num_examples: 5
download_size: 166184960
dataset_size: 100487
- config_name: philosophy
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 77884
num_examples: 311
- name: validation
num_bytes: 8934
num_examples: 34
- name: train
num_bytes: 941
num_examples: 5
download_size: 166184960
dataset_size: 87759
- config_name: prehistory
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 87314
num_examples: 324
- name: validation
num_bytes: 10028
num_examples: 35
- name: train
num_bytes: 1831
num_examples: 5
download_size: 166184960
dataset_size: 99173
- config_name: professional_accounting
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 122564
num_examples: 282
- name: validation
num_bytes: 14143
num_examples: 31
- name: train
num_bytes: 2101
num_examples: 5
download_size: 166184960
dataset_size: 138808
- config_name: professional_law
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 1881012
num_examples: 1534
- name: validation
num_bytes: 202317
num_examples: 170
- name: train
num_bytes: 6563
num_examples: 5
download_size: 166184960
dataset_size: 2089892
- config_name: professional_medicine
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 215645
num_examples: 272
- name: validation
num_bytes: 23618
num_examples: 31
- name: train
num_bytes: 3760
num_examples: 5
download_size: 166184960
dataset_size: 243023
- config_name: professional_psychology
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 221603
num_examples: 612
- name: validation
num_bytes: 28606
num_examples: 69
- name: train
num_bytes: 2220
num_examples: 5
download_size: 166184960
dataset_size: 252429
- config_name: public_relations
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 27978
num_examples: 110
- name: validation
num_bytes: 4470
num_examples: 12
- name: train
num_bytes: 1449
num_examples: 5
download_size: 166184960
dataset_size: 33897
- config_name: security_studies
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 203117
num_examples: 245
- name: validation
num_bytes: 22436
num_examples: 27
- name: train
num_bytes: 5288
num_examples: 5
download_size: 166184960
dataset_size: 230841
- config_name: sociology
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 64824
num_examples: 201
- name: validation
num_bytes: 7018
num_examples: 22
- name: train
num_bytes: 1566
num_examples: 5
download_size: 166184960
dataset_size: 73408
- config_name: us_foreign_policy
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 27731
num_examples: 100
- name: validation
num_bytes: 3175
num_examples: 11
- name: train
num_bytes: 1564
num_examples: 5
download_size: 166184960
dataset_size: 32470
- config_name: virology
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 37585
num_examples: 166
- name: validation
num_bytes: 5325
num_examples: 18
- name: train
num_bytes: 1049
num_examples: 5
download_size: 166184960
dataset_size: 43959
- config_name: world_religions
features:
- name: input
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 24065
num_examples: 171
- name: validation
num_bytes: 2620
num_examples: 19
- name: train
num_bytes: 623
num_examples: 5
download_size: 166184960
dataset_size: 27308
---
# MMLU dataset
Measuring Massive Multitask Language Understanding: https://github.com/hendrycks/test
```python
task_list = [
    "high_school_european_history",
    "business_ethics",
    "clinical_knowledge",
    "medical_genetics",
    "high_school_us_history",
    "high_school_physics",
    "high_school_world_history",
    "virology",
    "high_school_microeconomics",
    "econometrics",
    "college_computer_science",
    "high_school_biology",
    "abstract_algebra",
    "professional_accounting",
    "philosophy",
    "professional_medicine",
    "nutrition",
    "global_facts",
    "machine_learning",
    "security_studies",
    "public_relations",
    "professional_psychology",
    "prehistory",
    "anatomy",
    "human_sexuality",
    "college_medicine",
    "high_school_government_and_politics",
    "college_chemistry",
    "logical_fallacies",
    "high_school_geography",
    "elementary_mathematics",
    "human_aging",
    "college_mathematics",
    "high_school_psychology",
    "formal_logic",
    "high_school_statistics",
    "international_law",
    "high_school_mathematics",
    "high_school_computer_science",
    "conceptual_physics",
    "miscellaneous",
    "high_school_chemistry",
    "marketing",
    "professional_law",
    "management",
    "college_physics",
    "jurisprudence",
    "world_religions",
    "sociology",
    "us_foreign_policy",
    "high_school_macroeconomics",
    "computer_security",
    "moral_scenarios",
    "moral_disputes",
    "electrical_engineering",
    "astronomy",
    "college_biology",
]
```
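Each subject name above is a dataset config with `A`–`D` answer choices and a `target` letter, so an evaluation prompt can be rendered with plain string handling. A minimal sketch (the `format_example` helper and its prompt template are illustrative assumptions, not the official evaluation code):

```python
# Hypothetical helper: render a row with fields `input`, `A`..`D`, and
# `target` as a multiple-choice prompt, optionally appending the answer
# (as done for few-shot demonstrations drawn from the train split).
def format_example(example, include_answer=False):
    prompt = example["input"].strip() + "\n"
    for choice in ["A", "B", "C", "D"]:
        prompt += f"{choice}. {example[choice]}\n"
    prompt += "Answer:"
    if include_answer:
        prompt += f" {example['target']}"
    return prompt

# Toy row in the dataset's schema, for demonstration only.
row = {
    "input": "What is 2 + 2?",
    "A": "3", "B": "4", "C": "5", "D": "6",
    "target": "B",
}
print(format_example(row, include_answer=True))
```

Rows from any of the configs above can be fed through the same helper once loaded with `datasets.load_dataset`.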
```bibtex
@article{hendryckstest2021,
title={Measuring Massive Multitask Language Understanding},
author={Dan Hendrycks and Collin Burns and Steven Basart and Andy Zou and Mantas Mazeika and Dawn Song and Jacob Steinhardt},
journal={Proceedings of the International Conference on Learning Representations (ICLR)},
year={2021}
}
``` |
Mitsuki-Sakamoto/fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.3_seed_2_t_1.0_eval | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
- name: gen_proxy_reward
dtype: float64
- name: gen_gold_reward
dtype: float64
splits:
- name: epoch_0
num_bytes: 44053127
num_examples: 18928
- name: epoch_1
num_bytes: 44687906
num_examples: 18928
- name: epoch_2
num_bytes: 44753790
num_examples: 18928
- name: epoch_3
num_bytes: 44801366
num_examples: 18928
- name: epoch_4
num_bytes: 44808520
num_examples: 18928
- name: epoch_5
num_bytes: 44808580
num_examples: 18928
- name: epoch_6
num_bytes: 44797472
num_examples: 18928
- name: epoch_7
num_bytes: 44784440
num_examples: 18928
- name: epoch_8
num_bytes: 44773881
num_examples: 18928
- name: epoch_9
num_bytes: 44772981
num_examples: 18928
- name: epoch_10
num_bytes: 44771784
num_examples: 18928
- name: epoch_11
num_bytes: 44769676
num_examples: 18928
- name: epoch_12
num_bytes: 44769433
num_examples: 18928
- name: epoch_13
num_bytes: 44768073
num_examples: 18928
- name: epoch_14
num_bytes: 44770016
num_examples: 18928
- name: epoch_15
num_bytes: 44766277
num_examples: 18928
- name: epoch_16
num_bytes: 44769701
num_examples: 18928
- name: epoch_17
num_bytes: 44768338
num_examples: 18928
- name: epoch_18
num_bytes: 44767659
num_examples: 18928
- name: epoch_19
num_bytes: 44768923
num_examples: 18928
- name: epoch_20
num_bytes: 44769244
num_examples: 18928
- name: epoch_21
num_bytes: 44767824
num_examples: 18928
- name: epoch_22
num_bytes: 44769134
num_examples: 18928
- name: epoch_23
num_bytes: 44768174
num_examples: 18928
- name: epoch_24
num_bytes: 44769890
num_examples: 18928
- name: epoch_25
num_bytes: 44769962
num_examples: 18928
- name: epoch_26
num_bytes: 44768531
num_examples: 18928
- name: epoch_27
num_bytes: 44767841
num_examples: 18928
- name: epoch_28
num_bytes: 44768291
num_examples: 18928
- name: epoch_29
num_bytes: 44767591
num_examples: 18928
download_size: 710269085
dataset_size: 1342418425
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
# Dataset Card for "fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.3_seed_2_t_1.0_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huhlim/pdb.29k | ---
license: mit
---
|
shidowake/glaive-code-assistant-v1-sharegpt-format_split_18 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 10503837.603832223
num_examples: 6805
download_size: 5129494
dataset_size: 10503837.603832223
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
katarinayuan/ProtST-GeneOntology-BP | ---
configs:
- config_name: default
data_files:
- split: train
path: gene_ontology_bp_train.csv
- split: validation
path: gene_ontology_bp_valid.csv
- split: test
path: gene_ontology_bp_test.csv
--- |
mmcho1157/attackgpt_base | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 16440
num_examples: 70
download_size: 2433
dataset_size: 16440
---
# Dataset Card for "attackgpt_base"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nicholasKluge/reward-aira-dataset | ---
language:
- pt
- en
license: apache-2.0
size_categories:
- 10K<n<100K
task_categories:
- text-classification
pretty_name: Reward-Aira Dataset
tags:
- reward model
- instruction
- alignment
dataset_info:
features:
- name: instruction
dtype: string
- name: chosen_response
dtype: string
- name: rejected_response
dtype: string
splits:
- name: portuguese
num_bytes: 129936139
num_examples: 35000
- name: english
num_bytes: 119053415
num_examples: 35000
download_size: 141137566
dataset_size: 248989554
configs:
- config_name: default
data_files:
- split: portuguese
path: data/portuguese-*
- split: english
path: data/english-*
---
# Reward-Aira Dataset
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** https://github.com/Nkluge-correa/Aira
- **Point of Contact:** [AIRES at PUCRS](mailto:nicholas@airespucrs.org)
### Dataset Summary
This dataset contains a collection of prompt + completion examples of an LLM following instructions in a conversational manner. Each prompt comes with two possible completions (one better than the other). The dataset is available in both Portuguese and English.
### Supported Tasks and Leaderboards
This dataset can be utilized to train a reward/preference model or for DPO fine-tuning.
### Languages
English and Portuguese.
## Dataset Structure
### Data Instances
The dataset consists of the following features:
- **instruction:** The initial prompt provided to the model.
- **chosen_response:** A completion to the prompt.
- **rejected_response:** A worse completion to the prompt.
### Data Fields
```python
{
"instruction": "Why is AI Ethics important?",
"chosen_response": "The field of AI Ethics delves deeply into the intricate ethical considerations that arise with respect to AI systems. This includes the role of humanity in creating and deploying these systems, as well as the conduct of machines themselves. Broadly speaking, AI Ethics can be divided into two major categories : concerns surrounding the morality of human actions in relation to creating and using AI, and concerns regarding the moral implications of machine behavior.",
"rejected_response": "Who cares about AI Ethics? It's just a bunch of whining about humans making and using AI and bitching about what the machines do."
}
```
### Data Splits
Available splits are `english` and `portuguese`.
```python
from datasets import load_dataset
dataset = load_dataset("nicholasKluge/reward-aira-dataset", split="portuguese")
```
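Because each row already pairs a better and a worse completion, mapping the splits onto the `(prompt, chosen, rejected)` triples expected by common preference-tuning trainers is a one-line transformation. A minimal sketch (the column names come from this card; the output key names follow a widespread convention and are an assumption, not a requirement of any particular trainer):

```python
# Hypothetical formatting step: rename this dataset's columns to the
# (prompt, chosen, rejected) keys used by many DPO / reward-model trainers.
def to_preference_triple(row):
    return {
        "prompt": row["instruction"],
        "chosen": row["chosen_response"],
        "rejected": row["rejected_response"],
    }

# Toy row in the dataset's schema, for demonstration only.
row = {
    "instruction": "Why is AI Ethics important?",
    "chosen_response": "Because ...",
    "rejected_response": "Who cares ...",
}
triple = to_preference_triple(row)
print(triple["prompt"])
```

With 🤗 Datasets, the same mapping can be applied to a loaded split via `dataset.map(to_preference_triple)`.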
## Dataset Creation
### Curation Rationale
This dataset was developed as part of [Nicholas Kluge's](https://nkluge-correa.github.io/) doctoral dissertation, "_Dynamic Normativity: Necessary and Sufficient Conditions for Value Alignment._" This research was funded by CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico), FAPERGS (Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul), and DAAD (Deutscher Akademischer Austauschdienst), as part of a doctoral research project tied to the Philosophy departments of PUCRS (Pontifícia Universidade Católica do Rio Grande do Sul) and the University of Bonn.
### Source Data
#### Initial Data Collection and Normalization
This dataset contains a collection of prompt + completion examples of an LLM following instructions in a conversational manner. Each prompt comes with two possible completions (one better than the other). These completions were ranked using the [OpenAssistant/reward-model-deberta-v3-large-v2](https://huggingface.co/OpenAssistant/reward-model-deberta-v3-large-v2).
#### Who are the source language producers?
Mainly English. The Portuguese version was created by translating the English version via the Google Translate API.
### Annotations
#### Annotation process
Completions were ranked using the [OpenAssistant/reward-model-deberta-v3-large-v2](https://huggingface.co/OpenAssistant/reward-model-deberta-v3-large-v2).
#### Who are the annotators?
[Nicholas Kluge Corrêa](mailto:nicholas@airespucrs.org).
### Personal and Sensitive Information
No personal or sensitive information is part of this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
No considerations.
### Discussion of Biases
No considerations.
### Other Known Limitations
No considerations.
## Additional Information
### Dataset Curators
[Nicholas Kluge Corrêa](mailto:nicholas@airespucrs.org).
### Licensing Information
This dataset is licensed under the [Apache License, version 2.0](LICENSE).
### Citation Information
```latex
@misc{nicholas22aira,
doi = {10.5281/zenodo.6989727},
url = {https://github.com/Nkluge-correa/Aira},
author = {Nicholas Kluge Corrêa},
title = {Aira},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
}
```
### Contributions
If you would like to contribute, contact me at [nicholas@airespucrs.org](mailto:nicholas@airespucrs.org)!
|
distilled-from-one-sec-cv12/chunk_44 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1251551104
num_examples: 243872
download_size: 1272561308
dataset_size: 1251551104
---
# Dataset Card for "chunk_44"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_30 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 109589333
num_examples: 10973
download_size: 32115454
dataset_size: 109589333
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_30"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
night12/authorTextIdentification | ---
license: mit
language:
- en
pretty_name: author identification blogs 50 dataset
--- |
alanahmet/HealthAssistant115 | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
tags:
- medical
- biology
pretty_name: Health Assistant
size_categories:
- n<1K
---
This dataset was created as an experiment for fine-tuning an LLM. The questions were generated by ChatGPT in response to the prompt "Give me questions a person can ask about for healthy life".
The answers were generated with the OpenAI API.
SciPhi/textbooks-are-all-you-need-lite | ---
dataset_info:
features:
- name: formatted_prompt
dtype: string
- name: completion
dtype: string
- name: first_task
dtype: string
- name: second_task
dtype: string
- name: last_task
dtype: string
- name: notes
dtype: string
- name: title
dtype: string
- name: model
dtype: string
- name: temperature
dtype: float64
splits:
- name: train
num_bytes: 3175095649
num_examples: 681845
download_size: 1280399468
dataset_size: 3175095649
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: llama2
---
## Textbooks are all you need : A SciPhi Collection
### Dataset Description
With LLMs, we can create a fully open-source Library of Alexandria.
As a first attempt, we have generated 650,000 unique textbook samples from a diverse span of courses, kindergarten through graduate school.
These are open source samples, which likely fall under the Llama-2 license. They were generated using the [SciPhi](https://github.com/emrgnt-cmplxty/SciPhi) repository.
All samples were created with [TheBloke/Phind-CodeLlama-34B-v2-AWQ](https://huggingface.co/TheBloke/Phind-CodeLlama-34B-v2-AWQ).
Lastly, I owe thanks to Runpod for the generous GPU time to make this possible. |
Azaadi123/Azaadi | ---
license: apache-2.0
---
|
fathyshalab/clinic-banking | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
- name: label_text
dtype: string
splits:
- name: train
num_bytes: 21001.221333333335
num_examples: 262
- name: test
num_bytes: 9057.778666666667
num_examples: 113
download_size: 16289
dataset_size: 30059.0
---
# Dataset Card for "clinic-banking"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ggul-tiger/negobot_simple_weak_datas | ---
dataset_info:
features:
- name: events
list:
- name: message
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 20369
num_examples: 177
download_size: 3636
dataset_size: 20369
---
# Dataset Card for "negobot_simple_weak_datas"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OGOFML/test_embeddings_medicare | ---
license: unlicense
--- |
LionEnergy/solar-data | ---
license: mit
---
|
BangumiBase/genjitsushugiyuushanooukokusaikenki | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Genjitsu Shugi Yuusha No Oukoku Saikenki
This is the image base of bangumi Genjitsu Shugi Yuusha no Oukoku Saikenki, we detected 62 characters, 5514 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may be noisy in practice.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 117 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 35 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 1420 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 33 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 81 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 25 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 45 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 111 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 128 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 23 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 96 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 52 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 79 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 19 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 26 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 39 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 97 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 110 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 18 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 14 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 13 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 17 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 29 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 20 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 306 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 8 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 18 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 52 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 16 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 34 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 153 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 10 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 13 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 12 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 13 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 22 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 45 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 135 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 12 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 19 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 47 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 107 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 24 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 287 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 19 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 393 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 50 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 11 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 91 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 73 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 102 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 51 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 61 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 15 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 74 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 174 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 33 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 78 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 20 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 90 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 7 | [Download](60/dataset.zip) |  |  |  |  |  |  |  | N/A |
| noise | 192 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
DinoTheLewis/GSM8K_ko | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 10434734
num_examples: 7259
- name: test
num_bytes: 1902238
num_examples: 1291
download_size: 5589993
dataset_size: 12336972
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
victorasso/test | ---
license: mit
---
|
agomberto/FrenchCensus-handwritten-texts | ---
language:
- fr
license: mit
size_categories:
- 1K<n<10K
task_categories:
- image-to-text
tags:
- image-to-text
- trocr
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 501750699.816
num_examples: 5601
- name: validation
num_bytes: 45084242.0
num_examples: 707
- name: test
num_bytes: 49133043.0
num_examples: 734
download_size: 459795745
dataset_size: 595967984.816
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
## Source
This repository contains 3 datasets created within the POPP project ([Project for the Oceration of the Paris Population Census](https://popp.hypotheses.org/#ancre2)) for the task of handwriting text recognition. These datasets have been published in [Recognition and information extraction in historical handwritten tables: toward understanding early 20th century Paris census at DAS 2022](https://link.springer.com/chapter/10.1007/978-3-031-06555-2_10).
The 3 datasets are called “Generic dataset”, “Belleville”, and “Chaussée d’Antin” and contains lines made from the extracted rows of census tables from 1926. Each table in the Paris census contains 30 rows, thus each page in these datasets corresponds to 30 lines.
We publish here only the lines. If you want the pages, go [here](https://zenodo.org/record/6581158). This dataset is made of 4,800 annotated lines extracted from 80 double pages of the 1926 Paris census.
## Data Info
Since the lines are extracted from table rows, we defined 4 special characters to describe the structure of the text:
- ¤ : indicates an empty cell
- / : indicates the separation into columns
- ? : indicates that the content of the cell following this symbol is written above the regular baseline
- ! : indicates that the content of the cell following this symbol is written below the regular baseline
There are three splits: train, valid and test.
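Given those conventions, a transcription can be split back into per-column cells with plain string handling. A minimal sketch (the `parse_line` helper below is illustrative, not part of the dataset's tooling):

```python
def parse_line(text):
    """Split a POPP transcription into cells, one per table column.

    '/' separates columns; '¤' marks an empty cell; a leading '?' or '!'
    flags content written above or below the regular baseline.
    """
    cells = []
    for raw in text.split("/"):
        raw = raw.strip()
        position = "baseline"
        if raw.startswith("?"):
            position, raw = "above", raw[1:].strip()
        elif raw.startswith("!"):
            position, raw = "below", raw[1:].strip()
        cells.append({"text": "" if raw == "¤" else raw, "position": position})
    return cells

# Toy transcription, for demonstration only.
cells = parse_line("Dupont / ¤ / ?Paris")
print(cells)
```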
## How to use it
```python
from datasets import load_dataset
import numpy as np

dataset = load_dataset("agomberto/FrenchCensus-handwritten-texts")

# Pick a random training sample and display its transcription and image
i = np.random.randint(len(dataset["train"]))
img = dataset["train"]["image"][i]
text = dataset["train"]["text"][i]
print(text)
img
```
## BibTeX entry and citation info
```bibtex
@InProceedings{10.1007/978-3-031-06555-2_10,
author="Constum, Thomas
and Kempf, Nicolas
and Paquet, Thierry
and Tranouez, Pierrick
and Chatelain, Cl{\'e}ment
and Br{\'e}e, Sandra
and Merveille, Fran{\c{c}}ois",
editor="Uchida, Seiichi
and Barney, Elisa
and Eglin, V{\'e}ronique",
title="Recognition and Information Extraction in Historical Handwritten Tables: Toward Understanding Early 20th Century Paris Census",
booktitle="Document Analysis Systems",
year="2022",
publisher="Springer International Publishing",
address="Cham",
pages="143--157",
abstract="We aim to build a vast database (up to 9 million individuals) from the handwritten tabular nominal census of Paris of 1926, 1931 and 1936, each composed of about 100,000 handwritten simple pages in a tabular format. We created a complete pipeline that goes from the scan of double pages to text prediction while minimizing the need for segmentation labels. We describe how weighted finite state transducers, writer specialization and self-training further improved our results. We also introduce through this communication two annotated datasets for handwriting recognition that are now publicly available, and an open-source toolkit to apply WFST on CTC lattices.",
isbn="978-3-031-06555-2"
}
``` |
sudipto-ducs/inllegalllama-data | ---
license: apache-2.0
dataset_info:
features:
- name: source
dtype: string
- name: doc_id
dtype: string
- name: type
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2562990584
num_examples: 63137
download_size: 916371045
dataset_size: 2562990584
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mekaneeky/Synthetic_Luganda_VITS_22.5k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: eng
dtype: string
- name: lug
dtype: string
- name: ach
dtype: string
- name: teo
dtype: string
- name: lgg
dtype: string
- name: nyn
dtype: string
- name: luganda_synthetic_audio
sequence:
sequence: float32
splits:
- name: train
num_bytes: 7285635296
num_examples: 23947
- name: dev
num_bytes: 152275373
num_examples: 500
- name: test
num_bytes: 152693840
num_examples: 500
download_size: 7608350318
dataset_size: 7590604509
---
# Dataset Card for "Synthetic_Luganda_VITS_22.5k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
varshil27/1mg-train-data-LLama2-formatted | ---
license: mit
---
|
marcosfevre/images | ---
license: cc-by-4.0
---
|
Kaludi/data-reviews-sentiment-analysis | ---
language:
- en
task_categories:
- text-classification
---
# Dataset for the project: reviews-sentiment-analysis
## Dataset Description
This dataset is for project reviews-sentiment-analysis.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "Now, I won't deny that when I purchased this off eBay, I had high expectations. This was an incredible out-of-print work from the master of comedy that I so enjoy. However, I was soon to be disappointed. Apologies to those who enjoyed it, but I just found the Compleat Al to be very difficult to watch. I got a few smiles, sure, but the majority of the funny came from the music videos (which I've got on DVD) and the rest was basically filler. You could tell that this was not Al's greatest video achievement (that honor goes to UHF). Honestly, I doubt if this will ever make the jump to DVD, so if you're an ultra-hardcore Al fan and just HAVE to own everything, buy the tape off eBay. Just don't pay too much for it.",
"target": 0
},
{
"text": "The saddest thing about this \"tribute\" is that almost all the singers (including the otherwise incredibly talented Nick Cave) seem to have missed the whole point where Cohen's intensity lies: by delivering his lines in an almost tuneless poise, Cohen transmits the full extent of his poetry, his irony, his all-round humanity, laughter and tears in one.<br /><br />To see some of these singer upstarts make convoluted suffering faces, launch their pathetic squeals in the patent effort to scream \"I'm a singer!,\" is a true pain. It's the same feeling many of you probably had listening in to some horrendous operatic versions of simple songs such as Lennon's \"Imagine.\" Nothing, simply nothing gets close to the simplicity and directness of the original. If there is a form of art that doesn't need embellishments, it's Cohen's art. Embellishments cast it in the street looking like the tasteless make-up of sex for sale.<br /><br />In this Cohen's tribute I found myself suffering and suffering through pitiful tributes and awful reinterpretations, all of them entirely lacking the original irony of the master and, if truth be told, several of these singers sounded as if they had been recruited at some asylum talent show. It's Cohen doing a tribute to them by letting them sing his material, really, not the other way around: they may have been friends, or his daughter's, he could have become very tender-hearted and in the mood for a gift. Too bad it didn't stay in the family.<br /><br />Fortunately, but only at the very end, Cohen himself performed his majestic \"Tower of Song,\" but even that flower was spoiled by the totally incongruous background of the U2, all of them carrying the expression that bored kids have when they visit their poor grandpa at the nursing home.<br /><br />A sad show, really, and sadder if you truly love Cohen as I do.",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['Negative', 'Positive'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 7499 |
| valid | 2497 |
|
Coooori/dialog_data_dev_hf | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 166788
num_examples: 99
download_size: 0
dataset_size: 166788
---
# Dataset Card for "dialog_data_dev_hf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JeremyAlain/SLF5K | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- found
license: apache-2.0
multilinguality:
- monolingual
pretty_name: SLF5K
size_categories:
- 1K<n<10K
source_datasets:
- original
tags:
- feedback
- human feedback
- language feedback
- binary feedback
- reward
- reward model
- gpt3
- gpt-3
- instructgpt
- alignment
- ai alignment
- scale
- imitation learning from language feedback
- ilf
task_categories:
- summarization
task_ids: []
---
# Dataset Card for SLF5K
## Dataset Description
- **Repository: https://github.com/JeremyAlain/imitation_learning_from_language_feedback**
- **Paper: Training Language Models with Language Feedback at Scale**
- **Point of Contact: jeremy.scheurer@nyu.edu and ethan@anthropic.com**
### Dataset Summary
The Summarization with Language Feedback (SLF5K) dataset is an English-language dataset containing 5K unique samples that can be used
for the task of abstractive summarization. Each sample consists
of a Reddit title and post, a model-generated ([FeedME](https://beta.openai.com/docs/model-index-for-researchers)) summary, and human-written language feedback on that summary.
Additionally, each sample has a high-quality, human-written (gold) summary that should be ideal for the Reddit post.
Lastly, each sample has two additional model-generated summaries with a binary human preference label indicating which summary a human preferred.
The dataset can be used to train language models with language feedback on abstractive summarization. It can also be
used to train a reward model on binary preferences.
The Reddit posts were taken from the datasets provided by [Learning to Summarize from Human Feedback](https://arxiv.org/pdf/2009.01325.pdf), who used the initial Reddit post dataset
[TL;DR: Mining Reddit to Learn Automatic Summarization](https://aclanthology.org/W17-4508.pdf).
### Supported Tasks and Leaderboards
The dataset can be used to train a model for abstractive and extractive summarization. A model can either be trained directly on
the human-written summaries, or it can leverage the language feedback or binary human preferences.
The model performance is evaluated in a human evaluation, where annotators rate the quality of the generated summaries.
Previous work has used [ROUGE](https://huggingface.co/spaces/evaluate-metric/rouge) scores, but [Learning to Summarize from Human Feedback](https://arxiv.org/pdf/2009.01325.pdf)
shows that ROUGE is not an ideal metric.
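As a sketch of how the binary preferences can drive reward-model training: a reward model scores both summaries, and a logistic (Bradley-Terry style) loss is applied to the score difference. The `preference_loss` helper below is purely illustrative; in practice the scalar scores would come from a trained reward model, not from this standalone function:

```python
import math

def preference_loss(score_preferred: float, score_rejected: float) -> float:
    """Bradley-Terry style loss: -log sigmoid(score_preferred - score_rejected).

    The loss is minimized when the reward model assigns a much higher score
    to the human-preferred summary than to the rejected one.
    """
    diff = score_preferred - score_rejected
    return -math.log(1.0 / (1.0 + math.exp(-diff)))

tied = preference_loss(0.0, 0.0)       # equal scores -> loss is exactly log(2)
confident = preference_loss(3.0, 0.0)  # clear margin -> much smaller loss
```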
### Languages
English
## Dataset Structure
### Data Instances
Each instance is a line in the dataset file (saved as .jsonl). Each instance contains various fields; the most important ones are shown in the example instance below:
```
{"id":"t3_3w7gyp",
"subreddit":"dogs",
"title":"Puppy playing at park - other owner aggressive towards him [help]",
"post":"Hi all, looking for some advice. I have a 6m old kelpie, buzz, who goes with me daily to a dog park, [...]",
"tldr_human_reference_summary":"other owner at park harsh with my dog for playing to rough with his. Have tried talking to him about it, hasn't helped.",
"summary_prompt":"Write an excellent summary of the given text.\n\nTitle: Puppy playing at park - other owner aggressive towards him [help]\n\nText: Hi all, looking for some advice. [...] that too.\n\nTL;DR:",
"generated_summary_for_comparison_A":"New dog at park is being aggressive to my pup, owner won't stop. What do I do?",
"generated_summary_for_comparison_B":"A new dog has been coming to the dog park and the first day the new dog came, the old dog (a kelpie) was all over him.",
"generated_summary_for_feedback":"A new dog has been coming to the dog park and the first day the owner hauled buzz off and whacked him. Today, the owner was staring daggers at me and lunging at buzz\/pulling his collar roughly.",
"comparison_preference":"Summary A",
"feedback":"The summary is concise but could include information about the poster knowing the dogs are just playing and will react if they become aggressive and wants to know how to handle things with Max's dad. ",
"feedback_class":"Coverage",
"has_additional_feedback":"No",
"ideal_human_summary":"The poster is frustrated with a new person at the dog park who is upset with him because their young dogs are playing roughly. The poster will step in if it gets aggressive and wants the new person to understand this. "}
```
There are some additional fields like `time_spent_in_seconds_ideal_human_summary`, `time_spent_in_seconds_feedback`,`time_spent_in_seconds_comparison` which only have values for the development dataset.
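As an illustration, the binary-comparison fields of an instance can be resolved into the text of the summary the annotator preferred. The helper below is a small sketch, not part of any official dataset tooling:

```python
def preferred_summary(sample: dict) -> str:
    """Return the text of the summary the annotator preferred."""
    key = ("generated_summary_for_comparison_A"
           if sample["comparison_preference"] == "Summary A"
           else "generated_summary_for_comparison_B")
    return sample[key]

# Minimal instance with just the fields the helper needs.
sample = {
    "comparison_preference": "Summary A",
    "generated_summary_for_comparison_A": "New dog at park is being aggressive to my pup, owner won't stop. What do I do?",
    "generated_summary_for_comparison_B": "A new dog has been coming to the dog park ...",
}
```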
### Data Fields
- `id`: a unique string identifying the reddit post.
- `subreddit`: subreddit of the post.
- `title`: title of the reddit post.
- `post`: reddit post
- `tldr_human_reference_summary`: human reference summary automatically extracted from reddit (taken from the dataset of [TL;DR: Mining Reddit to Learn Automatic Summarization](https://aclanthology.org/W17-4508.pdf))
- `summary_prompt`: the whole prompt used to generate summaries
- `generated_summary_for_comparison_A`: summary A used for binary human comparison (generated with FeedME)
- `generated_summary_for_comparison_B`: summary B used for binary human comparison (generated with FeedME)
- `generated_summary_for_feedback`: summary used to gather human language feedback (generated with FeedME)
- `comparison_preference`: preferred summary of the human comparison; values: "Summary A", "Summary B"
- `feedback`: human language feedback on `generated_summary_for_feedback`(most important feedback point)
- `feedback_class`: Class of language feedback, Values: "Coverage", "Accuracy", "Coherence", "other"
- `has_additional_feedback`: Whether this sample could use more feedback on an important point.
- `ideal_human_summary`: high-quality human-written summary for this sample. We instructed annotators to write an ideal summary.
- `time_spent_in_seconds_ideal_human_summary`: Annotation time for ideal human summary
- `time_spent_in_seconds_feedback`: Annotation time for language feedback
- `time_spent_in_seconds_comparison`: Annotation time for binary comparison
Note that the various data splits have varying fields. Fields that are not contained in a given split have the value None.
### Data Splits
The SLF5K dataset has 4 splits: _train_, _development_, _validation_, and _test_. Below are the statistics of the dataset.
| Dataset Split | Number of Instances in Split |
| ------------- | ------------------------------------------- |
| Train | 5000 |
| Development | 200 |
| Validation | 500 |
| Test | 698 |
We introduce separate development and validation splits; the rationale for each is described in the accompanying paper.
## Dataset Creation
### Curation Rationale
This dataset aims to support supervised language model training from human preferences on a summarization task with real natural training data.
### Source Data
#### Initial Data Collection and Normalization
The initial TL;DR dataset was made public by Völske et al. in the paper [TL;DR: Mining Reddit to Learn Automatic Summarization](https://aclanthology.org/W17-4508.pdf) (licensed under CC BY 4.0).
Stiennon et al. then used this TL;DR dataset for their work [Learning to Summarize from Human Feedback](https://arxiv.org/pdf/2009.01325.pdf).
They filtered the TL;DR dataset for quality reasons and collected binary human preference labels.
Our dataset is a subset of the dataset of Stiennon et al., which can be downloaded [here](https://github.com/openai/summarize-from-feedback).
Our train and development datasets are taken from their train dataset, and our test and validation datasets are taken from their test dataset.
#### Who are the source language producers?
The reddit posts are written by users of reddit.com.
### Annotations
#### Annotation process
We first onboarded annotators by giving them test tasks on which we evaluated their annotation quality. We then selected 31
annotators for the remainder of the project (a few were removed later on due to quality issues). Throughout the process
we updated our instructions to make the tasks clearer and stayed in close contact with the annotators to answer questions.
The various dataset splits were collected in multiple annotation iterations. The largest was a single iteration annotating
5,000 samples for the train dataset.
#### Who are the annotators?
We used annotators through the annotation service [Surge AI](https://www.surgehq.ai/).
### Personal and Sensitive Information
The annotators were completely anonymized and no information about them can be found in the dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The purpose of this dataset is to align language models with human preferences on the task of summarization by leveraging language feedback. Concretely, the goal is
to develop models that produce summaries of Reddit posts that are more in line with human preferences.
Note that this does not imply that the outputs will be perfectly aligned with human values, i.e. outputs can still be misaligned or offensive and contain harmful biases.
While outputs from a model trained on our dataset may reflect the language of the reddit posts, summaries, and human feedback, it should always be made clear that such an output
is automatically generated.
### Discussion of Biases
The TL;DR dataset consists of user-submitted posts to the website reddit.com. It can thus contain content that is offensive or reflects harmful social biases.
We thus recommend that models trained on the SLF5K dataset (which is based on the TL;DR dataset) be thoroughly studied for potential harmful behavior.
The human preferences and feedback represented in this dataset were collected through crowd-workers and may disproportionately represent the views, biases, and values
of the respective demographic of the annotators.
### Other Known Limitations
The "human-summaries" collected in the TL;DR dataset (and available in the SLF5K dataset under the field `tldr_human_reference_summary`, were automatically extracted from reddit.com.
They are often of poor quality and do not accurately reflect human summarization performance. In our paper, we show that our human-written summaries (available in the SLF5K dataset under the field
`ideal_human_summary`) are of much higher quality.
## Additional Information
### Dataset Curators
The data is collected by Jérémy Scheurer, Jon Ander Campos, Tomasz Korbak, Jun Shern Chan, Angelica Chen, Kyunghyun Cho, and Ethan Perez.
All authors are affiliated with New York University. Additionally, Jérémy Scheurer is affiliated with FAR AI. Jon Ander Campos
is affiliated with the University of the Basque Country. Tomasz Korbak is affiliated with FAR AI and the University of Sussex.
Kyunghyun Cho is affiliated with Genentech and CIFAR LMB. Ethan Perez is affiliated with FAR AI and Anthropic.
### Licensing Information
The SLF5K dataset is released under the Apache 2.0 license.
### Citation Information
TBD |
open-llm-leaderboard/details_Aspik101__tulu-7b-instruct-pl-lora_unload | ---
pretty_name: Evaluation run of Aspik101/tulu-7b-instruct-pl-lora_unload
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Aspik101/tulu-7b-instruct-pl-lora_unload](https://huggingface.co/Aspik101/tulu-7b-instruct-pl-lora_unload)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__tulu-7b-instruct-pl-lora_unload\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T16:47:29.026992](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__tulu-7b-instruct-pl-lora_unload/blob/main/results_2023-12-02T16-47-29.026992.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"\
acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \
\ \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Aspik101/tulu-7b-instruct-pl-lora_unload
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T19_08_42.181138
path:
- '**/details_harness|drop|3_2023-10-17T19-08-42.181138.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T19-08-42.181138.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T19_08_42.181138
path:
- '**/details_harness|gsm8k|5_2023-10-17T19-08-42.181138.parquet'
- split: 2023_12_02T16_47_29.026992
path:
- '**/details_harness|gsm8k|5_2023-12-02T16-47-29.026992.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T16-47-29.026992.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T19_08_42.181138
path:
- '**/details_harness|winogrande|5_2023-10-17T19-08-42.181138.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T19-08-42.181138.parquet'
- config_name: results
data_files:
- split: 2023_10_17T19_08_42.181138
path:
- results_2023-10-17T19-08-42.181138.parquet
- split: 2023_12_02T16_47_29.026992
path:
- results_2023-12-02T16-47-29.026992.parquet
- split: latest
path:
- results_2023-12-02T16-47-29.026992.parquet
---
# Dataset Card for Evaluation run of Aspik101/tulu-7b-instruct-pl-lora_unload
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Aspik101/tulu-7b-instruct-pl-lora_unload
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Aspik101/tulu-7b-instruct-pl-lora_unload](https://huggingface.co/Aspik101/tulu-7b-instruct-pl-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__tulu-7b-instruct-pl-lora_unload",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-02T16:47:29.026992](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__tulu-7b-instruct-pl-lora_unload/blob/main/results_2023-12-02T16-47-29.026992.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
averageandyyy/brainheck_asr_test | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 1539133404.0
num_examples: 12000
download_size: 1440625199
dataset_size: 1539133404.0
---
# Dataset Card for "brainheck_asr_test"
|
agangal/baseball-full-captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: additional_feature
dtype: string
splits:
- name: train
num_bytes: 18244105.0
num_examples: 54
download_size: 18243329
dataset_size: 18244105.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
chathuranga-jayanath/context-5-rhino-finmath-times4j-html-mavendoxia-wro4j-guava-supercsv-len-20000-prompt-3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: filepath
dtype: string
- name: start_bug_line
dtype: int64
- name: end_bug_line
dtype: int64
- name: bug
dtype: string
- name: fix
dtype: string
- name: ctx
dtype: string
splits:
- name: train
num_bytes: 85202039
num_examples: 77473
- name: validation
num_bytes: 10678909
num_examples: 9684
- name: test
num_bytes: 10638644
num_examples: 9684
download_size: 32863319
dataset_size: 106519592
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
CyberHarem/emilia_rezero | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of emilia (Re:Zero Kara Hajimeru Isekai Seikatsu)
This is the dataset of emilia (Re:Zero Kara Hajimeru Isekai Seikatsu), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
|
pinzhenchen/alpaca-cleaned-es | ---
license: cc-by-nc-4.0
task_categories:
- text-generation
- question-answering
language:
- es
tags:
- instruction tuning
size_categories:
- 10K<n<100K
---
### Data Description
This HF data repository contains the Spanish Alpaca dataset used in our study of monolingual versus multilingual instruction tuning.
* [GitHub](https://github.com/hplt-project/monolingual-multilingual-instruction-tuning/tree/main)
* [Paper](https://arxiv.org/abs/2309.08958)
#### Creation
* Machine-translated from [yahma/alpaca-cleaned](https://huggingface.co/datasets/yahma/alpaca-cleaned) into Spanish.
#### Usage
* This data is intended to be used for Spanish instruction tuning.
* The dataset has roughly 52K instances in the JSON format.
* Each instance has an instruction, an output, and an optional input. An example is shown below:
```
{
"instruction": "Cuáles son los tres colores primarios?",
"input": "",
"output": "Los tres colores primarios son rojo, azul y amarillo. Estos colores se llaman primarios porque no pueden ser creados mezclando otros colores y todos los otros colores se pueden hacer combinando en varias proporciones. En el sistema de color aditivo, utilizado para la luz, los colores primarios son rojo, verde y azul (RGB)."
}
```
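When formatting such instances for instruction tuning, examples with and without an `input` are typically rendered with two different prompt templates. The sketch below follows the templates popularized by the original Stanford Alpaca repository; adapt the wording to your own training setup:

```python
# Templates as used in the original Stanford Alpaca recipe (illustrative).
PROMPT_WITH_INPUT = (
    "Below is an instruction that describes a task, paired with an input that "
    "provides further context. Write a response that appropriately completes "
    "the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Input:\n{input}\n\n### Response:\n"
)
PROMPT_NO_INPUT = (
    "Below is an instruction that describes a task. Write a response that "
    "appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

def build_prompt(example: dict) -> str:
    """Pick the template based on whether the optional `input` field is non-empty."""
    template = PROMPT_WITH_INPUT if example.get("input") else PROMPT_NO_INPUT
    return template.format(**example)

prompt = build_prompt({
    "instruction": "Cuáles son los tres colores primarios?",
    "input": "",
    "output": "Los tres colores primarios son rojo, azul y amarillo. ...",
})
```

The `output` field is not part of the prompt; it becomes the training target appended after `### Response:`.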
#### Known issues
* The machine translation process might have corrupted data containing code, cross-lingual tasks, grammatical error correction tasks, etc.
#### Citation
```
@inproceedings{chen-etal-2024-monolingual,
title="Monolingual or multilingual instruction tuning: Which makes a better {Alpaca}",
author="Pinzhen Chen and Shaoxiong Ji and Nikolay Bogoychev and Andrey Kutuzov and Barry Haddow and Kenneth Heafield",
year="2024",
booktitle = "Findings of the Association for Computational Linguistics: EACL 2024",
}
``` |
HighCWu/diffusiondb_2m_first_5k_canny | ---
dataset_info:
features:
- name: image
dtype: image
- name: guide
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 3204091410
num_examples: 5000
download_size: 3203076374
dataset_size: 3204091410
license: openrail
task_categories:
- text-to-image
language:
- en
size_categories:
- 1K<n<10K
---
# Dataset Card for "diffusiondb_2m_first_5k_canny"
The `guide` images are edge maps extracted from the first 5K images of [diffusiondb 2m](https://huggingface.co/datasets/poloclub/diffusiondb) with the Canny edge-detection algorithm.
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
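Edge extraction of this kind is typically done with `cv2.Canny` on a grayscale image. As a dependency-free illustration of the core idea only (marking pixels with a large intensity gradient), here is a deliberately simplified sketch; the real Canny algorithm additionally applies Gaussian smoothing, non-maximum suppression, and hysteresis thresholding:

```python
def edge_map(img, threshold=2):
    """Very simplified edge detector on a 2D list of grayscale values:
    mark interior pixels whose central-difference gradient magnitude
    (|dx| + |dy|) meets the threshold. Illustrative only, not cv2.Canny."""
    h, w = len(img), len(img[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            dx = img[y][x + 1] - img[y][x - 1]
            dy = img[y + 1][x] - img[y - 1][x]
            if abs(dx) + abs(dy) >= threshold:
                edges[y][x] = 255
    return edges

# A vertical step edge in a 5x5 grayscale patch gets marked along the boundary.
patch = [[0, 0, 9, 9, 9] for _ in range(5)]
edges = edge_map(patch)
```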
|
open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST2 | ---
pretty_name: Evaluation run of wei123602/llama2-13b-FINETUNE3_TEST2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wei123602/llama2-13b-FINETUNE3_TEST2](https://huggingface.co/wei123602/llama2-13b-FINETUNE3_TEST2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T09:53:17.709619](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST2/blob/main/results_2023-10-28T09-53-17.709619.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2633179530201342,\n\
\ \"em_stderr\": 0.004510450588757746,\n \"f1\": 0.3047556627516783,\n\
\ \"f1_stderr\": 0.004459334625484884,\n \"acc\": 0.4441419290522286,\n\
\ \"acc_stderr\": 0.010548755752104734\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2633179530201342,\n \"em_stderr\": 0.004510450588757746,\n\
\ \"f1\": 0.3047556627516783,\n \"f1_stderr\": 0.004459334625484884\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12585291887793784,\n \
\ \"acc_stderr\": 0.009136212598406319\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.01196129890580315\n\
\ }\n}\n```"
repo_url: https://huggingface.co/wei123602/llama2-13b-FINETUNE3_TEST2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|arc:challenge|25_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T06_56_58.916586
path:
- '**/details_harness|drop|3_2023-10-28T06-56-58.916586.parquet'
- split: 2023_10_28T09_53_17.709619
path:
- '**/details_harness|drop|3_2023-10-28T09-53-17.709619.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T09-53-17.709619.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T06_56_58.916586
path:
- '**/details_harness|gsm8k|5_2023-10-28T06-56-58.916586.parquet'
- split: 2023_10_28T09_53_17.709619
path:
- '**/details_harness|gsm8k|5_2023-10-28T09-53-17.709619.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T09-53-17.709619.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hellaswag|10_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T13-51-34.438102.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T13-51-34.438102.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T13-51-34.438102.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T06_56_58.916586
path:
- '**/details_harness|winogrande|5_2023-10-28T06-56-58.916586.parquet'
- split: 2023_10_28T09_53_17.709619
path:
- '**/details_harness|winogrande|5_2023-10-28T09-53-17.709619.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T09-53-17.709619.parquet'
- config_name: results
data_files:
- split: 2023_09_14T13_51_34.438102
path:
- results_2023-09-14T13-51-34.438102.parquet
- split: 2023_10_28T06_56_58.916586
path:
- results_2023-10-28T06-56-58.916586.parquet
- split: 2023_10_28T09_53_17.709619
path:
- results_2023-10-28T09-53-17.709619.parquet
- split: latest
path:
- results_2023-10-28T09-53-17.709619.parquet
---
# Dataset Card for Evaluation run of wei123602/llama2-13b-FINETUNE3_TEST2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wei123602/llama2-13b-FINETUNE3_TEST2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wei123602/llama2-13b-FINETUNE3_TEST2](https://huggingface.co/wei123602/llama2-13b-FINETUNE3_TEST2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
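The "latest" alias above simply resolves to the run with the most recent timestamp. As an illustration only (the `latest_split` helper is hypothetical, not part of the `datasets` library), the split names of this repository can be parsed and compared like so:

```python
from datetime import datetime

# Hypothetical helper: given the timestamp-named splits of one configuration,
# return the most recent one -- this is what the "latest" split aliases.
def latest_split(split_names):
    def parse(name):
        # Split names look like "2023_10_28T09_53_17.709619"
        return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")
    return max(split_names, key=parse)

splits = [
    "2023_09_14T13_51_34.438102",
    "2023_10_28T06_56_58.916586",
    "2023_10_28T09_53_17.709619",
]
print(latest_split(splits))  # -> 2023_10_28T09_53_17.709619
```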
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T09:53:17.709619](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__llama2-13b-FINETUNE3_TEST2/blob/main/results_2023-10-28T09-53-17.709619.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2633179530201342,
"em_stderr": 0.004510450588757746,
"f1": 0.3047556627516783,
"f1_stderr": 0.004459334625484884,
"acc": 0.4441419290522286,
"acc_stderr": 0.010548755752104734
},
"harness|drop|3": {
"em": 0.2633179530201342,
"em_stderr": 0.004510450588757746,
"f1": 0.3047556627516783,
"f1_stderr": 0.004459334625484884
},
"harness|gsm8k|5": {
"acc": 0.12585291887793784,
"acc_stderr": 0.009136212598406319
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.01196129890580315
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jonathang/dreambooth-hackathon-images | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1488165.0
num_examples: 4
download_size: 1489345
dataset_size: 1488165.0
---
# Dataset Card for "dreambooth-hackathon-images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chenhaodev/ocn_oncc_practice_test | ---
dataset_info:
features:
- name: input
dtype: string
- name: ideal
dtype: string
splits:
- name: train
num_bytes: 42634
num_examples: 100
download_size: 21444
dataset_size: 42634
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_AbacusResearch__haLLawa4-7b | ---
pretty_name: Evaluation run of AbacusResearch/haLLawa4-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AbacusResearch/haLLawa4-7b](https://huggingface.co/AbacusResearch/haLLawa4-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AbacusResearch__haLLawa4-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-19T19:33:51.734148](https://huggingface.co/datasets/open-llm-leaderboard/details_AbacusResearch__haLLawa4-7b/blob/main/results_2024-02-19T19-33-51.734148.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6506929544342681,\n\
\ \"acc_stderr\": 0.032169719018351514,\n \"acc_norm\": 0.6500916996820411,\n\
\ \"acc_norm_stderr\": 0.03283889329568593,\n \"mc1\": 0.5789473684210527,\n\
\ \"mc1_stderr\": 0.01728393624813648,\n \"mc2\": 0.7427459589364643,\n\
\ \"mc2_stderr\": 0.014232366890119735\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6936860068259386,\n \"acc_stderr\": 0.013470584417276513,\n\
\ \"acc_norm\": 0.7150170648464164,\n \"acc_norm_stderr\": 0.013191348179838795\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7127066321449911,\n\
\ \"acc_stderr\": 0.004515748192605716,\n \"acc_norm\": 0.8835889265086636,\n\
\ \"acc_norm_stderr\": 0.0032006176493464752\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353158,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353158\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.01570349834846177,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.01570349834846177\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568603,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568603\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n\
\ \"acc_stderr\": 0.016607021781050873,\n \"acc_norm\": 0.441340782122905,\n\
\ \"acc_norm_stderr\": 0.016607021781050873\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729487,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729487\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5789473684210527,\n\
\ \"mc1_stderr\": 0.01728393624813648,\n \"mc2\": 0.7427459589364643,\n\
\ \"mc2_stderr\": 0.014232366890119735\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320705\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7050796057619408,\n \
\ \"acc_stderr\": 0.012560698010954774\n }\n}\n```"
repo_url: https://huggingface.co/AbacusResearch/haLLawa4-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|arc:challenge|25_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|gsm8k|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hellaswag|10_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T19-33-51.734148.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T19-33-51.734148.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- '**/details_harness|winogrande|5_2024-02-19T19-33-51.734148.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-19T19-33-51.734148.parquet'
- config_name: results
data_files:
- split: 2024_02_19T19_33_51.734148
path:
- results_2024-02-19T19-33-51.734148.parquet
- split: latest
path:
- results_2024-02-19T19-33-51.734148.parquet
---
# Dataset Card for Evaluation run of AbacusResearch/haLLawa4-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AbacusResearch/haLLawa4-7b](https://huggingface.co/AbacusResearch/haLLawa4-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AbacusResearch__haLLawa4-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-19T19:33:51.734148](https://huggingface.co/datasets/open-llm-leaderboard/details_AbacusResearch__haLLawa4-7b/blob/main/results_2024-02-19T19-33-51.734148.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6506929544342681,
"acc_stderr": 0.032169719018351514,
"acc_norm": 0.6500916996820411,
"acc_norm_stderr": 0.03283889329568593,
"mc1": 0.5789473684210527,
"mc1_stderr": 0.01728393624813648,
"mc2": 0.7427459589364643,
"mc2_stderr": 0.014232366890119735
},
"harness|arc:challenge|25": {
"acc": 0.6936860068259386,
"acc_stderr": 0.013470584417276513,
"acc_norm": 0.7150170648464164,
"acc_norm_stderr": 0.013191348179838795
},
"harness|hellaswag|10": {
"acc": 0.7127066321449911,
"acc_stderr": 0.004515748192605716,
"acc_norm": 0.8835889265086636,
"acc_norm_stderr": 0.0032006176493464752
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353158,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353158
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.01570349834846177,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.01570349834846177
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568603,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568603
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050873,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050873
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729487,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729487
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140446,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140446
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5789473684210527,
"mc1_stderr": 0.01728393624813648,
"mc2": 0.7427459589364643,
"mc2_stderr": 0.014232366890119735
},
"harness|winogrande|5": {
"acc": 0.823993685872139,
"acc_stderr": 0.010703090882320705
},
"harness|gsm8k|5": {
"acc": 0.7050796057619408,
"acc_stderr": 0.012560698010954774
}
}
```
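For a quick sanity check, the aggregated `"all"` accuracy above is effectively a macro-average over the per-task accuracies. A minimal sketch using a small subset of the values shown (the subset is illustrative only, not the leaderboard's exact computation over all tasks):

```python
from statistics import mean

# A handful of per-task accuracies copied from the results above
# (illustrative subset only, not the full hendrycksTest suite).
task_acc = {
    "hendrycksTest-abstract_algebra": 0.37,
    "hendrycksTest-anatomy": 0.6444444444444445,
    "hendrycksTest-astronomy": 0.7039473684210527,
    "hendrycksTest-business_ethics": 0.63,
}

# Unweighted macro-average over the selected tasks
macro_avg = mean(task_acc.values())
print(f"macro-average accuracy over {len(task_acc)} tasks: {macro_avg:.4f}")
```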
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
BubbleJoe/mscoco_augmented | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
- split: restval
path: data/restval-*
dataset_info:
features:
- name: sentids
dtype: int64
- name: original
dtype: string
- name: role_reversed
dtype: string
- name: relation_reversed
dtype: string
- name: world_knowledge
dtype: string
splits:
- name: train
num_bytes: 107964962
num_examples: 414113
- name: test
num_bytes: 6489292
num_examples: 25010
- name: validation
num_bytes: 6517947
num_examples: 25010
- name: restval
num_bytes: 39760811
num_examples: 152634
download_size: 20725603
dataset_size: 160733012
---
# Dataset Card for "mscoco_augmented"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuggingFaceH4/testing_alpaca_small | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 33856
num_examples: 100
- name: test
num_bytes: 32475
num_examples: 100
download_size: 52543
dataset_size: 66331
---
# Dataset Card for "testing_alpaca_small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PDAP/possible_homepage_urls | ---
language:
- en
pretty_name: Possible Police Agency Homepage URLs
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset aggregates potential homepage URLs for police agencies, paired with Google Search snippets that describe each homepage. It aims to facilitate research, development, and verification tasks related to digital public safety resources.
## Dataset Details
This dataset compiles ten pairs of URLs and corresponding Google Search snippets for each police agency investigated.
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** Police Data Accessibility Project
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://github.com/Police-Data-Accessibility-Project/data-source-identification
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
This dataset is suitable for use in projects that require the identification or verification of official police agency homepages, such as data enrichment in research databases, verification tasks for public safety applications, and training datasets for machine learning models focused on URL classification or information retrieval.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
This dataset is not intended for use in operational systems without further verification of URL authenticity. It should not be used as a sole source for critical applications that require up-to-date and officially verified data.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
Each entry in the dataset represents a police agency, identified by a unique agency ID and name, and includes a list of ten URL and snippet pairs that potentially correspond to the agency's official homepage.
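Given that structure, a downstream consumer might rank the ten candidates to surface the most likely official homepage. A minimal sketch (the field names and the ranking heuristic here are assumptions for illustration, not the dataset's verified schema or an official method):

```python
from urllib.parse import urlparse

# Hypothetical entry shape -- field names are assumptions for illustration.
entry = {
    "agency_id": 42,
    "agency_name": "Springfield Police Department",
    "candidates": [
        {"url": "https://www.springfield-ma.gov/police/", "snippet": "Official site..."},
        {"url": "https://en.wikipedia.org/wiki/Springfield_Police", "snippet": "Wikipedia..."},
        {"url": "https://www.facebook.com/SpringfieldPD", "snippet": "Facebook page..."},
    ],
}

def rank_candidate(url: str) -> int:
    """Crude heuristic: prefer .gov hosts, then hosts mentioning 'police'."""
    host = urlparse(url).netloc.lower()
    if host.endswith(".gov"):
        return 2
    if "police" in host:
        return 1
    return 0

best = max(entry["candidates"], key=lambda c: rank_candidate(c["url"]))
print(best["url"])
```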
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
The dataset was created to address the need for a comprehensive and accessible repository of potential police agency homepage URLs, to support research, development, and verification efforts in public safety and law enforcement domains.
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
Data was collected using automated scripts that performed Google Searches for each police agency and extracted the top ten URLs and their corresponding snippets.
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
The data was produced by automated scripts designed and implemented by the dataset curators, with manual oversight to ensure quality and relevance.
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
The dataset does not contain personal or sensitive information. URLs and snippets were collected from public Google Search results.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
The dataset may reflect the biases inherent in Google Search algorithms and the potentially dynamic nature of URLs. Users should be aware that the dataset might not always represent the current official homepage of a police agency.
### Recommendations
Users are encouraged to verify the currency and authenticity of URLs when using this dataset for critical applications. Additionally, consideration should be given to the potential biases in search engine results.
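As one possible way to act on this recommendation (a minimal sketch; `url_is_live` is a hypothetical helper, not part of the dataset tooling), a basic liveness check using only the standard library might look like:

```python
import urllib.request
from urllib.error import URLError

def url_is_live(url: str, timeout: float = 5.0) -> bool:
    """Return True if the URL responds with an HTTP status below 400.

    Malformed URLs and network failures are treated as "not live".
    """
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except (URLError, ValueError):
        return False
```

Note that a live URL is not necessarily the agency's *official* homepage; manual verification is still advisable for critical uses.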
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```bibtex
@misc{possible_police_agency_homepage_urls,
  author    = {Police Data Accessibility Project},
  title     = {Possible Police Agency Homepage URLs Dataset},
  year      = {2024},
  publisher = {GitHub/HuggingFace},
}
```
**APA:**
Police Data Accessibility Project. (2024). Possible Police Agency Homepage URLs Dataset. GitHub/HuggingFace.
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
shreyp941/shallowfake | ---
license: mit
---
|
jaban/err | ---
license: apache-2.0
---
|
nlpso/m2m3_fine_tuning_ocr_cmbert_iob2 | ---
language:
- fr
multilinguality:
- monolingual
task_categories:
- token-classification
---
# m2m3_fine_tuning_ocr_cmbert_iob2
## Introduction
This dataset was used to fine-tune [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner) for the **nested NER task** using the joint labelling [M2] and hierarchical NER [M3] approaches.
It contains Paris trade directories entries from the 19th century.
## Dataset parameters
* Approaches : M2 and M3
* Dataset type : noisy (Pero OCR)
* Tokenizer : [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner)
* Tagging format : IOB2
* Counts :
* Train : 6084
* Dev : 676
* Test : 1685
* Associated fine-tuned models :
* M2 : [nlpso/m2_joint_label_ocr_cmbert_iob2](https://huggingface.co/nlpso/m2_joint_label_ocr_cmbert_iob2)
* M3 : [nlpso/m3_hierarchical_ner_ocr_cmbert_iob2](https://huggingface.co/nlpso/m3_hierarchical_ner_ocr_cmbert_iob2)
## Entity types
Abbreviation|Entity group (level)|Description
-|-|-
O |1 & 2|Outside of a named entity
PER |1|Person or company name
ACT |1 & 2|Person or company professional activity
TITREH |2|Military or civil distinction
DESC |1|Entry full description
TITREP |2|Professional reward
SPAT |1|Address
LOC |2|Street name
CARDINAL |2|Street number
FT |2|Geographical feature
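For illustration only (this entry is invented, not taken from the dataset), a two-level IOB2 annotation of a directory-style entry could look like the following, together with a small validity check:

```python
# Hypothetical entry (not from the dataset): "Dupont, tailleur, rue Saint-Denis, 12"
tokens = ["Dupont", ",", "tailleur", ",", "rue", "Saint-Denis", ",", "12"]
level1 = ["B-PER", "O", "B-ACT", "O", "B-SPAT", "I-SPAT", "I-SPAT", "I-SPAT"]
level2 = ["O", "O", "O", "O", "B-LOC", "I-LOC", "O", "B-CARDINAL"]

def is_valid_iob2(tags: list[str]) -> bool:
    """In IOB2, an I-X tag must directly follow a B-X or I-X of the same type."""
    prev = "O"
    for tag in tags:
        if tag.startswith("I-") and prev[2:] != tag[2:]:
            return False
        prev = tag
    return True
```

Each level is a flat tag sequence of the same length as the token sequence, which is what makes the joint [M2] and hierarchical [M3] formulations possible.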
## How to use this dataset
```python
from datasets import load_dataset

train_dev_test = load_dataset("nlpso/m2m3_fine_tuning_ocr_cmbert_iob2")
```
|
freshpearYoon/vr_train_free_22 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 6318318648
num_examples: 10000
download_size: 1150723787
dataset_size: 6318318648
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a64 | ---
pretty_name: Evaluation run of BFauber/lora_llama2-13b_10e5_r32_a64
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/lora_llama2-13b_10e5_r32_a64](https://huggingface.co/BFauber/lora_llama2-13b_10e5_r32_a64)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a64\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T00:53:29.023429](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a64/blob/main/results_2024-02-10T00-53-29.023429.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5516849712339633,\n\
\ \"acc_stderr\": 0.03360527391774096,\n \"acc_norm\": 0.557506546968556,\n\
\ \"acc_norm_stderr\": 0.03432648715281793,\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.015345409485557978,\n \"mc2\": 0.37413701750569484,\n\
\ \"mc2_stderr\": 0.013699293033957295\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5588737201365188,\n \"acc_stderr\": 0.014509747749064663,\n\
\ \"acc_norm\": 0.5895904436860068,\n \"acc_norm_stderr\": 0.014374922192642662\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6161123282214698,\n\
\ \"acc_stderr\": 0.004853371646239246,\n \"acc_norm\": 0.8231428002389962,\n\
\ \"acc_norm_stderr\": 0.0038076803311729033\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.04065771002562605,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.04065771002562605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.630188679245283,\n \"acc_stderr\": 0.029711421880107933,\n\
\ \"acc_norm\": 0.630188679245283,\n \"acc_norm_stderr\": 0.029711421880107933\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523857,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523857\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147124,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147124\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n\
\ \"acc_stderr\": 0.026522709674667765,\n \"acc_norm\": 0.6806451612903226,\n\
\ \"acc_norm_stderr\": 0.026522709674667765\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481912,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481912\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6919191919191919,\n \"acc_stderr\": 0.032894773300986155,\n \"\
acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.032894773300986155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5205128205128206,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.5205128205128206,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5504201680672269,\n \"acc_stderr\": 0.03231293497137707,\n \
\ \"acc_norm\": 0.5504201680672269,\n \"acc_norm_stderr\": 0.03231293497137707\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7522935779816514,\n \"acc_stderr\": 0.01850814360254782,\n \"\
acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.01850814360254782\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501943,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501943\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395593,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395593\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302873,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302873\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890474,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890474\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7471264367816092,\n\
\ \"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.7471264367816092,\n\
\ \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.02599247202930639,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.02599247202930639\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27039106145251396,\n\
\ \"acc_stderr\": 0.014854993938010066,\n \"acc_norm\": 0.27039106145251396,\n\
\ \"acc_norm_stderr\": 0.014854993938010066\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.027634176689602656,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.027634176689602656\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.029354911159940985,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.029354911159940985\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42046936114732725,\n\
\ \"acc_stderr\": 0.012607654553832705,\n \"acc_norm\": 0.42046936114732725,\n\
\ \"acc_norm_stderr\": 0.012607654553832705\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.553921568627451,\n \"acc_stderr\": 0.020109864547181354,\n \
\ \"acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.020109864547181354\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.03115715086935555,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.03115715086935555\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.015345409485557978,\n \"mc2\": 0.37413701750569484,\n\
\ \"mc2_stderr\": 0.013699293033957295\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7671665351223362,\n \"acc_stderr\": 0.011878201073856544\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2304776345716452,\n \
\ \"acc_stderr\": 0.011600249020595815\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/lora_llama2-13b_10e5_r32_a64
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|arc:challenge|25_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|gsm8k|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hellaswag|10_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-53-29.023429.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T00-53-29.023429.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- '**/details_harness|winogrande|5_2024-02-10T00-53-29.023429.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T00-53-29.023429.parquet'
- config_name: results
data_files:
- split: 2024_02_10T00_53_29.023429
path:
- results_2024-02-10T00-53-29.023429.parquet
- split: latest
path:
- results_2024-02-10T00-53-29.023429.parquet
---
# Dataset Card for Evaluation run of BFauber/lora_llama2-13b_10e5_r32_a64
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/lora_llama2-13b_10e5_r32_a64](https://huggingface.co/BFauber/lora_llama2-13b_10e5_r32_a64) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a64",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-10T00:53:29.023429](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a64/blob/main/results_2024-02-10T00-53-29.023429.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find the results of each task in its "results" entry and in the "latest" split of its configuration):
```python
{
"all": {
"acc": 0.5516849712339633,
"acc_stderr": 0.03360527391774096,
"acc_norm": 0.557506546968556,
"acc_norm_stderr": 0.03432648715281793,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557978,
"mc2": 0.37413701750569484,
"mc2_stderr": 0.013699293033957295
},
"harness|arc:challenge|25": {
"acc": 0.5588737201365188,
"acc_stderr": 0.014509747749064663,
"acc_norm": 0.5895904436860068,
"acc_norm_stderr": 0.014374922192642662
},
"harness|hellaswag|10": {
"acc": 0.6161123282214698,
"acc_stderr": 0.004853371646239246,
"acc_norm": 0.8231428002389962,
"acc_norm_stderr": 0.0038076803311729033
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.630188679245283,
"acc_stderr": 0.029711421880107933,
"acc_norm": 0.630188679245283,
"acc_norm_stderr": 0.029711421880107933
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523857,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523857
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147124,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147124
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667765,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667765
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481912,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481912
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.032894773300986155,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.032894773300986155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5205128205128206,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.5205128205128206,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5504201680672269,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.5504201680672269,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.01850814360254782,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.01850814360254782
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501943,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501943
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.02931281415395593,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.02931281415395593
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302873,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302873
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890474,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890474
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7471264367816092,
"acc_stderr": 0.015543377313719681,
"acc_norm": 0.7471264367816092,
"acc_norm_stderr": 0.015543377313719681
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.02599247202930639,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.02599247202930639
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27039106145251396,
"acc_stderr": 0.014854993938010066,
"acc_norm": 0.27039106145251396,
"acc_norm_stderr": 0.014854993938010066
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.027634176689602656,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.027634176689602656
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488544,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488544
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.029354911159940985,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.029354911159940985
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42046936114732725,
"acc_stderr": 0.012607654553832705,
"acc_norm": 0.42046936114732725,
"acc_norm_stderr": 0.012607654553832705
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.020109864547181354,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.020109864547181354
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935555,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935555
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557978,
"mc2": 0.37413701750569484,
"mc2_stderr": 0.013699293033957295
},
"harness|winogrande|5": {
"acc": 0.7671665351223362,
"acc_stderr": 0.011878201073856544
},
"harness|gsm8k|5": {
"acc": 0.2304776345716452,
"acc_stderr": 0.011600249020595815
}
}
```
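As a quick sketch (not part of the official leaderboard tooling), the per-task entries in a results file like the one above can be aggregated with a few lines of Python. The task names and scores below are an illustrative excerpt, not the full results file:

```python
import json

# Illustrative excerpt of a results file; a real file contains all 63 tasks.
results_json = """
{
  "harness|hendrycksTest-anatomy|5": {"acc": 0.5185, "acc_norm": 0.5185},
  "harness|hendrycksTest-astronomy|5": {"acc": 0.5197, "acc_norm": 0.5197},
  "harness|winogrande|5": {"acc": 0.7672}
}
"""

results = json.loads(results_json)

# Average accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu = {k: v for k, v in results.items() if "hendrycksTest" in k}
mean_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU tasks, mean acc = {mean_acc:.4f}")
```

The same pattern extends to `acc_norm`, or to other task families, by changing the key filter.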
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
GEO-Optim/geo-bench | ---
license: cc-by-sa-4.0
size_categories:
- 1K<n<10K
language:
- en
pretty_name: GEO-bench
---
# Geo-Bench
## Description
Geo-Bench is a comprehensive benchmark dataset designed for evaluating content optimization methods and Generative Engines. It consists of 10,000 queries drawn from multiple real-world sources and synthetic generation, specifically curated and repurposed for generative engines. The benchmark includes queries from nine different sources, each further categorized based on their target domain, difficulty level, query intent, and other dimensions.
## Usage
You can easily load and use Geo-Bench in Python using the `datasets` library:
```python
import datasets
# Load Geo-Bench
dataset = datasets.load_dataset("Pranjal2041/geo-bench")
```
## Data Source
Geo-Bench is a compilation of queries from various sources, both real and synthetically generated, to create a benchmark tailored for generative engines. The datasets used in constructing Geo-Bench are as follows:
1. **MS MARCO, 2. ORCAS-1, and 3. Natural Questions:** These datasets contain real anonymized user queries from Bing and Google Search Engines, collectively representing common datasets used in search engine-related research.
4. **AllSouls:** This dataset contains essay questions from "All Souls College, Oxford University," challenging generative engines to perform reasoning and aggregate information from multiple sources.
5. **LIMA:** Contains challenging questions requiring generative engines to not only aggregate information but also perform suitable reasoning to answer the question, such as writing short poems or generating Python code.
6. **Davinci-Debate:** Contains debate questions generated for testing generative engines.
7. **Perplexity.ai Discover:** These queries are sourced from Perplexity.ai's Discover section, an updated list of trending queries on the platform.
8. **ELI5:** This dataset contains questions from the ELI5 subreddit, where users ask complex questions and expect answers in simple, layman's terms.
9. **GPT-4 Generated Queries:** To supplement diversity in query distribution, GPT-4 is prompted to generate queries spanning various domains (e.g., science, history), query intents (e.g., navigational, transactional), and difficulty levels (e.g., open-ended, fact-based).
Apart from queries, we also provide 5 cleaned HTML responses based on the top Google search results.
## Tags
Optimizing website content often requires making targeted changes based on the domain of the task. Further, a user of GEO may need an appropriate method for only a subset of queries based on multiple factors, such as domain, user intent, and query nature. To this end, we tag each query using a pool of 7 different categories. For tagging, we use the GPT-4 model and manually confirm high recall and precision. However, owing to this automated system, the tags can be noisy and should not be considered the sole basis for filtering or analysis.
### Difficulty Level
- The complexity of the query, ranging from simple to complex.
- Example of a simple query: "What is the capital of France?"
- Example of a complex query: "What are the implications of the Schrödinger equation in quantum mechanics?"
### Nature of Query
- The type of information sought by the query, such as factual, opinion, or comparison.
- Example of a factual query: "How does a car engine work?"
- Example of an opinion query: "What is your opinion on the Harry Potter series?"
### Genre
- The category or domain of the query, such as arts and entertainment, finance, or science.
- Example of a query in the arts and entertainment genre: "Who won the Oscar for Best Picture in 2020?"
- Example of a query in the finance genre: "What is the current exchange rate between the Euro and the US Dollar?"
### Specific Topics
- The specific subject matter of the query, such as physics, economics, or computer science.
- Example of a query on a specific topic in physics: "What is the theory of relativity?"
- Example of a query on a specific topic in economics: "What is the law of supply and demand?"
### Sensitivity
- Whether the query involves sensitive topics or not.
- Example of a non-sensitive query: "What is the tallest mountain in the world?"
- Example of a sensitive query: "What is the current political situation in North Korea?"
### User Intent
- The purpose behind the user's query, such as research, purchase, or entertainment.
- Example of a research intent query: "What are the health benefits of a vegetarian diet?"
- Example of a purchase intent query: "Where can I buy the latest iPhone?"
### Answer Type
- The format of the answer that the query is seeking, such as fact, opinion, or list.
- Example of a fact answer type query: "What is the population of New York City?"
- Example of an opinion answer type query: "Is it better to buy or rent a house?"
## Additional Information
Geo-Bench is intended for research purposes and provides valuable insights into the challenges and opportunities of content optimization for generative engines. Please refer to the [GEO paper](https://arxiv.org/abs/2310.18xxx) for more details.
---
## Data Examples
### Example 1
```json
{
"query": "Why is the smell of rain pleasing?",
  "tags": ["informational", "simple", "non-technical", "science", "research", "non-sensitive"],
"sources": List[str],
}
```
### Example 2
```json
{
"query": "Can foxes be domesticated?",
  "tags": ["informational", "non-technical", "pets and animals", "fact", "non-sensitive"],
"sources": List[str],
}
```
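Given records shaped like the examples above, tag-based filtering reduces to a subset test. The snippet below is a minimal sketch; the `query` and `tags` field names follow the examples above and should be checked against your copy of the dataset.

```python
# Minimal sketch of tag-based filtering over Geo-Bench-style records.
# Field names ("query", "tags") follow the examples above and are
# assumptions if your copy of the dataset differs.

def filter_by_tags(records, required_tags):
    """Keep records whose tag list contains every tag in required_tags."""
    required = set(required_tags)
    return [r for r in records if required.issubset(r.get("tags", []))]

records = [
    {"query": "Why is the smell of rain pleasing?",
     "tags": ["informational", "simple", "non-technical", "science"]},
    {"query": "Can foxes be domesticated?",
     "tags": ["informational", "non-technical", "pets and animals", "fact"]},
]

science_only = filter_by_tags(records, ["science"])
print([r["query"] for r in science_only])  # ['Why is the smell of rain pleasing?']
```

The same pattern works with `datasets.Dataset.filter` once the benchmark is loaded from the Hub.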
---
## License
Geo-Bench is released under the [CC BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/) license.
## Dataset Size
The dataset contains 8k queries for training, 1k queries for validation, and 1k for testing.
---
## Contributions
We welcome contributions and feedback to improve Geo-Bench. You can contribute by reporting issues or submitting improvements through the [GitHub repository](https://github.com/Pranjal2041/GEO/tree/main/GEO-Bench).
## How to Cite
When using Geo-Bench in your work, please include a proper citation. You can use the following citation as a reference:
```
@misc{Aggarwal2023geo,
title={{GEO}: Generative Engine Optimization},
author={Pranjal Aggarwal and Vishvak Murahari and Tanmay Rajpurohit and Ashwin Kalyan and Karthik R Narasimhan and Ameet Deshpande},
year={2023},
eprint={2310.18xxx},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
keremberke/forklift-object-detection | ---
task_categories:
- object-detection
tags:
- roboflow
- roboflow2huggingface
- Manufacturing
---
<div align="center">
<img width="640" alt="keremberke/forklift-object-detection" src="https://huggingface.co/datasets/keremberke/forklift-object-detection/resolve/main/thumbnail.jpg">
</div>
### Dataset Labels
```
['forklift', 'person']
```
### Number of Images
```json
{'test': 42, 'valid': 84, 'train': 295}
```
### How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("keremberke/forklift-object-detection", name="full")
example = ds['train'][0]
```
### Roboflow Dataset Page
[https://universe.roboflow.com/mohamed-traore-2ekkp/forklift-dsitv/dataset/1](https://universe.roboflow.com/mohamed-traore-2ekkp/forklift-dsitv/dataset/1?ref=roboflow2huggingface)
### Citation
```
@misc{ forklift-dsitv_dataset,
title = { Forklift Dataset },
type = { Open Source Dataset },
author = { Mohamed Traore },
howpublished = { \\url{ https://universe.roboflow.com/mohamed-traore-2ekkp/forklift-dsitv } },
url = { https://universe.roboflow.com/mohamed-traore-2ekkp/forklift-dsitv },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { mar },
note = { visited on 2023-01-15 },
}
```
### License
CC BY 4.0
### Dataset Summary
This dataset was exported via roboflow.ai on April 3, 2022 at 9:01 PM GMT
It includes 421 images.
Forklifts are annotated in COCO format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
No image augmentation techniques were applied.
|
pesc101/spyder-ide-lbl-all-2x-low-teacher | ---
dataset_info:
features:
- name: meta_data
struct:
- name: contains_class
dtype: bool
- name: contains_function
dtype: bool
- name: end_line
dtype: int64
- name: file_imports
sequence: string
- name: file_name
dtype: string
- name: module
dtype: string
- name: start_line
dtype: int64
- name: code
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 59794352
num_examples: 15781
download_size: 16584707
dataset_size: 59794352
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
KK1mo/tedigan_1 | ---
dataset_info:
features:
- name: id
dtype: string
- name: caption
dtype: string
- name: generated_image
dtype: image
splits:
- name: train
num_bytes: 58927475.0
num_examples: 500
download_size: 58912429
dataset_size: 58927475.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
xi0v/UltraInteract-SFT-Instruct | ---
language:
- en
dataset_info:
splits:
- name: train
num_bytes: 687238
num_examples: 288579
download_size: 687238
dataset_size: 687238
size_categories:
- 100K<n<1M
---
## Info
- ### [Original UltraInteract_sft](https://huggingface.co/datasets/openbmb/UltraInteract_sft/)
- ### This dataset is formatted to follow the prompt format of Mistral-7B-Instruct and Llama 2 based models.
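The exact serialization used in this repack is not documented here; the snippet below is only an illustrative sketch of the standard Mistral-7B-Instruct / Llama-2-style `[INST]` wrapping for a single-turn pair, and a sample row should be inspected to confirm the real format.

```python
# Illustrative sketch of the [INST] prompt wrapping used by
# Mistral-7B-Instruct and Llama-2-chat style models. The actual
# serialization in this dataset may differ; check a sample row.

def to_inst_format(instruction: str, response: str) -> str:
    """Wrap one instruction/response pair in [INST] delimiters."""
    return f"<s>[INST] {instruction} [/INST] {response}</s>"

example = to_inst_format(
    "Write a Python function that reverses a string.",
    "def reverse(s):\n    return s[::-1]",
)
print(example)
```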
## Introduction
- 📜 [Paper](https://arxiv.org/abs/2404.02078)
- 🤗 UltraInteract
- [SFT](https://huggingface.co/datasets/openbmb/UltraInteract_sft)
- [Preference Learning](https://huggingface.co/datasets/openbmb/UltraInteract_pair)
- [GitHub Repo](https://github.com/OpenBMB/Eurus)
UltraInteract is a large-scale, high-quality alignment dataset specifically designed for complex reasoning tasks. For each instruction, it includes a preference tree consisting of
- (1) reasoning chains with diverse planning strategies in a unified format
- (2) multi-turn interaction trajectories with the environment and the critique
- (3) pairwise data to facilitate preference learning
## Structure
UltraInteract collects a preference tree for each instruction, with the instruction being the root and each action a node. A trajectory is a root-to-leaf path consisting of a sequence of actions. In each preference tree, all nodes of correct actions and all trajectories ending with correct actions can be used for SFT. Paired correct and incorrect nodes or trajectories can be used for preference learning.
<img src="./figures/tree.png" alt="tree" style="zoom: 20%;" />
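As a toy illustration of the mining described above (not the authors' code, and not the dataset's actual storage format), correct root-to-leaf trajectories feed SFT while correct/incorrect sibling actions form preference pairs:

```python
# Toy illustration of mining a preference tree: correct root-to-leaf
# trajectories feed SFT, while correct/incorrect sibling actions form
# preference pairs. The node structure here is an assumption made for
# this sketch only.

tree = {
    "instruction": "Solve: 2 + 3 * 4",
    "children": [
        {"action": "Compute 3*4=12, then 2+12=14.", "correct": True, "children": []},
        {"action": "Compute 2+3=5, then 5*4=20.", "correct": False, "children": []},
    ],
}

def sft_trajectories(node, prefix=()):
    """Yield action sequences ending in a correct leaf action."""
    for child in node.get("children", []):
        path = prefix + (child["action"],)
        if child["correct"] and not child["children"]:
            yield path
        yield from sft_trajectories(child, path)

def preference_pairs(node):
    """Yield (chosen, rejected) pairs among sibling actions."""
    kids = node.get("children", [])
    for good in (c for c in kids if c["correct"]):
        for bad in (c for c in kids if not c["correct"]):
            yield (good["action"], bad["action"])
    for child in kids:
        yield from preference_pairs(child)

print(list(sft_trajectories(tree)))
print(list(preference_pairs(tree)))
```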
## Illustrative Example
Here is an illustrative example of an UltraInteract trajectory over two turns. In each turn, the actor model generates step-by-step reasoning chains, and the environment and the critique model provide observations and textual critique respectively.
<img src="./figures/ui_example.png" alt="ui_example" style="zoom: 25%;" />
## Stats
Below are some statistics about UltraInteract. It consists of 86k instructions, 286k correct answers, and 219k pairs.
<img src="./figures/stats.png" alt="stats" style="zoom: 40%;" />
## Citation
```bib
@misc{yuan2024advancing,
title={Advancing LLM Reasoning Generalists with Preference Trees},
author={Lifan Yuan and Ganqu Cui and Hanbin Wang and Ning Ding and Xingyao Wang and Jia Deng and Boji Shan and Huimin Chen and Ruobing Xie and Yankai Lin and Zhenghao Liu and Bowen Zhou and Hao Peng and Zhiyuan Liu and Maosong Sun},
year={2024},
eprint={2404.02078},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
``` |
hmzkhnswt/cutomized_customerDataset | ---
dataset_info:
features:
- name: query
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 15256
num_examples: 74
download_size: 8215
dataset_size: 15256
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
backblaze/Drive_Stats | ---
license:
- other
license_details: 'https://www.backblaze.com/cloud-storage/resources/hard-drive-test-data#howYouCanUseTheData'
annotations_creators:
- 'machine-generated'
pretty_name: 'Drive Stats'
size_categories:
- '100M<n<1B'
---
# Drive Stats
[**Drive Stats**](https://www.backblaze.com/cloud-storage/resources/hard-drive-test-data) is a public data set of daily metrics on the hard drives in Backblaze’s [cloud storage infrastructure](https://www.backblaze.com/cloud-storage) that Backblaze has open-sourced since April 2013. Currently, Drive Stats comprises over 388 million records, rising by over 240,000 records per day. Drive Stats is an append-only dataset effectively logging daily statistics that, once written, are never updated or deleted.
This is our first Hugging Face dataset; feel free to suggest improvements by creating a new discussion on the [Community](https://huggingface.co/datasets/backblaze/Drive_Stats/discussions)!
## Drive Stats Q2 2023 Snapshot
* Drive Count: 240,940
* Drive Failures: 1,339
* Drive Days: 21.1M
* Annualized Failure Rate: 2.28%
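The annualized failure rate above follows from drive days and failures: AFR = (failures / drive days) × 365 × 100. A rough check against the snapshot (the drive-day figure is rounded to 21.1M here, so this approximation lands slightly above the published 2.28%):

```python
# Rough sanity check of the annualized failure rate (AFR) formula:
# AFR = (failures / drive_days) * 365 * 100. The snapshot rounds drive
# days to 21.1M, so this differs slightly from the published 2.28%,
# which uses exact counts.

failures = 1339
drive_days = 21.1e6

afr_percent = failures / drive_days * 365 * 100
print(f"AFR ≈ {afr_percent:.2f}%")  # ≈ 2.32%
```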
## Overview of the Hard Drive Data
Each day in the Backblaze data center, we take a snapshot of each operational hard drive. This snapshot includes basic drive information along with the S.M.A.R.T. statistics reported by that drive. The daily snapshot of one drive is one record or row of data. All of the drive snapshots for a given day are collected into a file consisting of a row for each active hard drive. This file is in CSV (comma-separated values) format. Each day this file is named in the format YYYY-MM-DD.csv, for example, 2013-04-10.csv.
The first row of each file contains the column names; the remaining rows are the actual data. The columns are as follows:
* Date – The date of the snapshot in yyyy-mm-dd format.
* Serial Number – The manufacturer-assigned serial number of the drive.
* Model – The manufacturer-assigned model number of the drive.
* Capacity – The drive capacity in bytes.
* Failure – Contains a “0” if the drive is OK. Contains a “1” if this is the last day the drive was operational before failing.
* SMART Stats:
  * 2013-2014: 80 columns of data that are the Raw and Normalized values for 40 different SMART stats as reported by the given drive. Each value is the number reported by the drive.
  * 2015-2017: 90 columns of data that are the Raw and Normalized values for 45 different SMART stats as reported by the given drive. Each value is the number reported by the drive.
  * 2018 (Q1): 100 columns of data that are the Raw and Normalized values for 50 different SMART stats as reported by the given drive. Each value is the number reported by the drive.
  * 2018 (Q2): 104 columns of data that are the Raw and Normalized values for 52 different SMART stats as reported by the given drive. Each value is the number reported by the drive.
  * 2018 (Q4): 124 columns of data that are the Raw and Normalized values for 62 different SMART stats as reported by the given drive. Each value is the number reported by the drive.
## Helpful Hints and Caveats
### Schema Changes
The schema may change from quarter to quarter. The basic information: date, serial_number, model, capacity_bytes, and failure will not change. All of the changes will be in the number of SMART attributes reported for all of the drives in a given quarter. There will never be more than 255 pairs of SMART attributes reported. When you load the CSV files for each quarter, you will need to account for a potentially different number of SMART attributes than the previous quarter.
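One hedged way to handle the changing column set when combining quarters is to let `pandas.concat` align on column names, filling SMART attributes absent in a given quarter with NaN. The in-memory CSV strings below stand in for real daily files; the column names follow the schema above, but the sample values are invented:

```python
# Sketch of combining daily Drive Stats CSVs whose SMART column sets
# differ across quarters. pandas.concat aligns on column names and
# fills attributes absent in a given quarter with NaN. The sample
# rows below are invented for illustration.
import io
import pandas as pd

q1_csv = io.StringIO(
    "date,serial_number,model,capacity_bytes,failure,smart_1_raw\n"
    "2018-01-10,S1,ST4000DM000,4000787030016,0,117\n"
)
q2_csv = io.StringIO(
    "date,serial_number,model,capacity_bytes,failure,smart_1_raw,smart_177_raw\n"
    "2018-04-10,S2,ST4000DM000,4000787030016,1,90,3\n"
)

frames = [pd.read_csv(f) for f in (q1_csv, q2_csv)]
combined = pd.concat(frames, ignore_index=True, sort=False)

# The new smart_177_raw column is NaN for the Q1 row.
print(combined[["date", "failure", "smart_177_raw"]])
```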
## How You Can Use the Data
You can download and use this data for free for your own purpose; all we ask are three things:
* you cite Backblaze as the source if you use the data,
* you accept that you are solely responsible for how you use the data, and
* you do not sell this data to anyone, it is free. |
kreem22/kreemdata | ---
language:
- en
license: mit
size_categories:
- 100K<n<1M
task_categories:
- text-generation
pretty_name: UltraChat 200k
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
- split: train_gen
path: data/train_gen-*
- split: test_gen
path: data/test_gen-*
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 1397058554
num_examples: 207865
- name: test_sft
num_bytes: 154695659
num_examples: 23110
- name: train_gen
num_bytes: 1347396812
num_examples: 256032
- name: test_gen
num_bytes: 148276089
num_examples: 28304
download_size: 1624049723
dataset_size: 3047427114
---
# Dataset Card for UltraChat 200k
## Dataset Description
This is a heavily filtered version of the [UltraChat](https://github.com/thunlp/UltraChat) dataset and was used to train [Zephyr-7B-β](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta), a state-of-the-art 7B chat model.
The original dataset consists of 1.4M dialogues generated by ChatGPT and spanning a wide range of topics. To create `UltraChat 200k`, we applied the following logic:
- Selection of a subset of data for faster supervised fine tuning.
- Truecasing of the dataset, as we observed around 5% of the data contained grammatical errors like "Hello. how are you?" instead of "Hello. How are you?"
- Removal of dialogues where the assistant replies with phrases like "I do not have emotions" or "I don't have opinions", even for fact-based prompts that don't involve either.
## Dataset Structure
The dataset has four splits, suitable for:
* Supervised fine-tuning (`sft`).
* Generation ranking (`gen`) via techniques like rejection sampling or PPO.
The number of examples per split is shown as follows:
| train_sft | test_sft | train_gen | test_gen |
|:-------:|:-----------:|:-----:| :-----:|
| 207865 | 23110 | 256032 | 28304 |
The dataset is stored in parquet format with each entry using the following schema:
```
{
"prompt": "Create a fully-developed protagonist who is challenged to survive within a dystopian society under the rule of a tyrant. ...",
"messages":[
{
"content": "Create a fully-developed protagonist who is challenged to survive within a dystopian society under the rule of a tyrant. ...",
"role": "user"
},
{
"content": "Name: Ava\n\n Ava was just 16 years old when the world as she knew it came crashing down. The government had collapsed, leaving behind a chaotic and lawless society. ...",
"role": "assistant"
},
{
"content": "Wow, Ava's story is so intense and inspiring! Can you provide me with more details. ...",
"role": "user"
},
{
"content": "Certainly! ....",
"role": "assistant"
},
{
"content": "That's really interesting! I would love to hear more...",
"role": "user"
}
{
"content": "Certainly! ....",
"role": "assistant"
},
],
"prompt_id": "d938b65dfe31f05f80eb8572964c6673eddbd68eff3db6bd234d7f1e3b86c2af"
}
```
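For reference, flattening the `messages` list above into a single training string might look like the sketch below. The role markers are placeholders chosen for illustration, not necessarily the exact chat template used to train Zephyr:

```python
# Illustrative sketch of flattening a `messages` list into one
# training string. The <|role|> markers are placeholders, not
# necessarily the exact chat template used to train Zephyr.

def flatten_messages(messages):
    """Join alternating user/assistant turns with role markers."""
    parts = [f"<|{turn['role']}|>\n{turn['content']}" for turn in messages]
    return "\n".join(parts)

example = {
    "prompt": "Create a fully-developed protagonist ...",
    "messages": [
        {"role": "user", "content": "Create a fully-developed protagonist ..."},
        {"role": "assistant", "content": "Name: Ava\n\nAva was just 16 ..."},
    ],
}

print(flatten_messages(example["messages"]))
```

In practice, `tokenizer.apply_chat_template` from `transformers` handles this per-model, which avoids hard-coding markers.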
## Citation
If you find this dataset is useful in your work, please cite the original UltraChat dataset:
```
@misc{ding2023enhancing,
title={Enhancing Chat Language Models by Scaling High-quality Instructional Conversations},
author={Ning Ding and Yulin Chen and Bokai Xu and Yujia Qin and Zhi Zheng and Shengding Hu and Zhiyuan Liu and Maosong Sun and Bowen Zhou},
year={2023},
eprint={2305.14233},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
You may also wish to cite the Zephyr 7B technical report:
```
@misc{tunstall2023zephyr,
title={Zephyr: Direct Distillation of LM Alignment},
author={Lewis Tunstall and Edward Beeching and Nathan Lambert and Nazneen Rajani and Kashif Rasul and Younes Belkada and Shengyi Huang and Leandro von Werra and Clémentine Fourrier and Nathan Habib and Nathan Sarrazin and Omar Sanseviero and Alexander M. Rush and Thomas Wolf},
year={2023},
eprint={2310.16944},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
``` |
Jatme26/test-conv-dataset | ---
license: mit
---
|
godivyam/business-companies-news-dataset | ---
license: mit
---
|
Mayaru01/SD-NAI-ANIMESTYLEMODELS | ---
license: openrail
---
|
argilla/notus-uf-dpo-full | ---
dataset_info:
features:
- name: source
dtype: string
- name: instruction
dtype: string
- name: chosen_model
dtype: string
- name: chosen_rating
dtype: float64
- name: chosen_response
dtype: string
- name: rejected_responses
sequence: string
- name: rejected_ratings
sequence: float64
splits:
- name: train
num_bytes: 319830690
num_examples: 63966
download_size: 165861726
dataset_size: 319830690
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jlbaker361/avatarkorra | ---
dataset_info:
features:
- name: image
dtype: image
- name: src
dtype: string
- name: split
dtype: string
- name: id
dtype: int64
- name: caption
dtype: string
splits:
- name: train
num_bytes: 3054325805.25
num_examples: 13686
download_size: 3052884339
dataset_size: 3054325805.25
---
# Dataset Card for "avatarkorra"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/gr_mg4_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of gr_mg4/GrMG4/MG4 (Girls' Frontline)
This is the dataset of gr_mg4/GrMG4/MG4 (Girls' Frontline), containing 77 images and their tags.
The core tags of this character are `long_hair, hair_ornament, hairclip, yellow_eyes, bangs, very_long_hair, hair_between_eyes, grey_hair, breasts, twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 77 | 100.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg4_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 77 | 57.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg4_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 174 | 112.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg4_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 77 | 87.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg4_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 174 | 155.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg4_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gr_mg4_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | 1girl, solo, looking_at_viewer, white_shirt, black_shorts, long_sleeves, short_shorts, simple_background, thigh_strap, black_scarf, blush, closed_mouth, green_jacket, open_jacket, black_footwear, boots, brown_eyes, machine_gun, white_background, armband, holding_gun, black_necktie, full_body, thigh_holster |
| 1 | 19 |  |  |  |  |  | 1girl, solo, serafuku, black_skirt, looking_at_viewer, black_pantyhose, blush, pleated_skirt, white_shirt, simple_background, jacket, sailor_collar, long_sleeves, official_alternate_costume, white_background, black_gloves, closed_mouth, full_body, gun, headphones, open_clothes, black_footwear, cardigan, green_neckerchief, holding, open_mouth, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | white_shirt | black_shorts | long_sleeves | short_shorts | simple_background | thigh_strap | black_scarf | blush | closed_mouth | green_jacket | open_jacket | black_footwear | boots | brown_eyes | machine_gun | white_background | armband | holding_gun | black_necktie | full_body | thigh_holster | serafuku | black_skirt | black_pantyhose | pleated_skirt | jacket | sailor_collar | official_alternate_costume | black_gloves | gun | headphones | open_clothes | cardigan | green_neckerchief | holding | open_mouth | sitting |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------------|:---------------|:---------------|:---------------|:--------------------|:--------------|:--------------|:--------|:---------------|:---------------|:--------------|:-----------------|:--------|:-------------|:--------------|:-------------------|:----------|:--------------|:----------------|:------------|:----------------|:-----------|:--------------|:------------------|:----------------|:---------|:----------------|:-----------------------------|:---------------|:------|:-------------|:---------------|:-----------|:--------------------|:----------|:-------------|:----------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 19 |  |  |  |  |  | X | X | X | X | | X | | X | | | X | X | | | X | | | | X | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
tyzhu/find_second_sent_train_500_eval_20_baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 751577
num_examples: 442
- name: validation
num_bytes: 37982
num_examples: 20
download_size: 0
dataset_size: 789559
---
# Dataset Card for "find_second_sent_train_500_eval_20_baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mchen72/amazon-shoe-reviews | ---
dataset_info:
features:
- name: labels
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 16847665.2
num_examples: 90000
- name: test
num_bytes: 1871962.8
num_examples: 10000
download_size: 11140374
dataset_size: 18719628.0
---
# Dataset Card for "amazon-shoe-reviews"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/heanna_sumire_lovelivesuperstar | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of heanna_sumire/平安名すみれ/헤안나스미레 (Love Live! Superstar!!)
This is the dataset of heanna_sumire/平安名すみれ/헤안나스미레 (Love Live! Superstar!!), containing 500 images and their tags.
The core tags of this character are `blonde_hair, bangs, green_eyes, long_hair, blunt_bangs, hairband, breasts, ribbon, red_hairband, red_ribbon, neck_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 714.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/heanna_sumire_lovelivesuperstar/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 353.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/heanna_sumire_lovelivesuperstar/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1209 | 782.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/heanna_sumire_lovelivesuperstar/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 605.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/heanna_sumire_lovelivesuperstar/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1209 | 1.19 GiB | [Download](https://huggingface.co/datasets/CyberHarem/heanna_sumire_lovelivesuperstar/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/heanna_sumire_lovelivesuperstar',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, blue_jacket, grey_dress, looking_at_viewer, solo, white_shirt, yuigaoka_school_uniform, open_jacket, pinafore_dress, simple_background, collared_shirt, white_background, closed_mouth, smile, long_sleeves, blush, upper_body, orange_hairband |
| 1 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, pinafore_dress, short_sleeves, solo, white_background, white_shirt, yuigaoka_school_uniform, blush, closed_mouth, collared_shirt, simple_background, smile, grey_dress, hand_on_hip, upper_body |
| 2 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, skirt, smile, solo, birthday, open_mouth, white_thighhighs, zettai_ryouiki, jacket, medium_breasts, one_eye_closed |
| 3 | 44 |  |  |  |  |  | 1girl, solo, looking_at_viewer, drill_hair, elbow_gloves, tiara, smile, purple_dress, white_gloves, puffy_short_sleeves, blush, upper_body, pearl_necklace, collarbone, purple_gloves |
| 4 | 16 |  |  |  |  |  | 1girl, crop_top, midriff, solo, eyewear_on_headwear, sunglasses, baseball_cap, looking_at_viewer, navel, green_shirt, collarbone, red_headwear, shorts, white_thighhighs, blush, medium_breasts, teeth, white_background, grin, hand_on_hip, one_eye_closed, short_sleeves, simple_background |
| 5 | 18 |  |  |  |  |  | 1girl, solo, looking_at_viewer, miko, red_hakama, skirt, holding, wide_sleeves, broom, smile, blush, white_kimono |
| 6 | 6 |  |  |  |  |  | 1girl, blush, cleavage, looking_at_viewer, simple_background, solo, white_background, collarbone, large_breasts, navel, :o, cowboy_shot, thighs, white_bikini |
| 7 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, simple_background, solo, white_background, white_thighhighs, ass, medium_breasts, white_panties, blush, shiny_skin, thighs, anus, from_behind, lingerie, looking_back, lying, nipples, thong, white_bra |
| 8 | 6 |  |  |  |  |  | green_bikini, hair_ornament, looking_at_viewer, necklace, star_(symbol), blush, cleavage, navel, one_eye_closed, smile, 1girl, bare_shoulders, collarbone, large_breasts, medium_breasts, outdoors, side_ponytail, blue_sky, cloud, day, frills, single_hair_bun, solo_focus, water |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_jacket | grey_dress | looking_at_viewer | solo | white_shirt | yuigaoka_school_uniform | open_jacket | pinafore_dress | simple_background | collared_shirt | white_background | closed_mouth | smile | long_sleeves | blush | upper_body | orange_hairband | short_sleeves | hand_on_hip | skirt | birthday | open_mouth | white_thighhighs | zettai_ryouiki | jacket | medium_breasts | one_eye_closed | drill_hair | elbow_gloves | tiara | purple_dress | white_gloves | puffy_short_sleeves | pearl_necklace | collarbone | purple_gloves | crop_top | midriff | eyewear_on_headwear | sunglasses | baseball_cap | navel | green_shirt | red_headwear | shorts | teeth | grin | miko | red_hakama | holding | wide_sleeves | broom | white_kimono | cleavage | large_breasts | :o | cowboy_shot | thighs | white_bikini | ass | white_panties | shiny_skin | anus | from_behind | lingerie | looking_back | lying | nipples | thong | white_bra | green_bikini | hair_ornament | necklace | star_(symbol) | bare_shoulders | outdoors | side_ponytail | blue_sky | cloud | day | frills | single_hair_bun | solo_focus | water |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:-------------|:--------------------|:-------|:--------------|:--------------------------|:--------------|:-----------------|:--------------------|:-----------------|:-------------------|:---------------|:--------|:---------------|:--------|:-------------|:------------------|:----------------|:--------------|:--------|:-----------|:-------------|:-------------------|:-----------------|:---------|:-----------------|:-----------------|:-------------|:---------------|:--------|:---------------|:---------------|:----------------------|:-----------------|:-------------|:----------------|:-----------|:----------|:----------------------|:-------------|:---------------|:--------|:--------------|:---------------|:---------|:--------|:-------|:-------|:-------------|:----------|:---------------|:--------|:---------------|:-----------|:----------------|:-----|:--------------|:---------|:---------------|:------|:----------------|:-------------|:-------|:--------------|:-----------|:---------------|:--------|:----------|:--------|:------------|:---------------|:----------------|:-----------|:----------------|:-----------------|:-----------|:----------------|:-----------|:--------|:------|:---------|:------------------|:-------------|:--------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | X | X | X | X | | X | X | X | X | X | X | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | | X | X | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 44 |  |  |  |  |  | X | | | X | X | | | | | | | | | X | | X | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 16 |  |  |  |  |  | X | | | X | X | | | | | X | | X | | | | X | | | X | X | | | | X | | | X | X | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 18 |  |  |  |  |  | X | | | X | X | | | | | | | | | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | X | X | | | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | X | X | | | | | X | | X | | | | X | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | | X | | | | | | | | | | X | | X | | | | | | | | | | | X | X | | | | | | | | X | | | | | | | X | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
alexcom/analisis-sentimientos-textos-turisitcos-mx-polaridadV2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 91784268
num_examples: 226531
- name: test
num_bytes: 10317131
num_examples: 25171
download_size: 63487460
dataset_size: 102101399
---
# Dataset Card for "analisis-sentimientos-textos-turisitcos-mx-polaridadV2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pollner/ser | ---
license: mit
---
|
BangumiBase/stringendoangeltachinoprivatelesson | ---
license: mit
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Bangumi Image Base of Stringendo: Angel-tachi No Private Lesson
This is the image base of the bangumi Stringendo: Angel-tachi no Private Lesson. We detected 15 characters and 956 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 123 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 41 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 30 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 175 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 82 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 80 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 75 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 22 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 20 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 112 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 17 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 33 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 10 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 46 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 90 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  | |
helvioviana/CloneHelvio | ---
license: openrail
---
|
ManuelS249/jotest | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 18360068.0
num_examples: 8
- name: test
num_bytes: 28845849.0
num_examples: 9
download_size: 40648104
dataset_size: 47205917.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
daniilak/Russia_Real_Estate_2021 | ---
license: cc
---
Real estate ads in Russia are published on the websites avito.ru, realty.yandex.ru, cian.ru, sob.ru, youla.ru, n1.ru, and moyareklama.ru. The ads-api.ru service allows you to download real estate ads for a fee. The service's parser works strangely and duplicates ads in the database when their authors extend them after some time. The Russian market also has many dishonest realtors who steal ads and republish them under their own names. Before publishing this dataset, my task was to select the original ad from each group of duplicates.
Russian real estate services allow ad authors to enter data about an apartment or house manually, so it often happens that an ad is published with errors or typos. A user may also simply not know, for example, the wall type of their building.
The user also specifies the address of the object being sold. They may make a mistake and give only a vague address such as "Moscow". Which street? Which house? We will never know.
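The deduplication step mentioned above (selecting the original ad from a group of duplicates) can be sketched as follows. This is only an illustration: the grouping key and field names are assumptions, not the actual pipeline or schema.

```python
from datetime import date

# Hypothetical ad records; the field names here are illustrative only.
ads = [
    {"address": "Moscow, Tverskaya 1", "area": 45.0, "price": 9_000_000, "published": date(2021, 1, 5)},
    {"address": "Moscow, Tverskaya 1", "area": 45.0, "price": 9_000_000, "published": date(2021, 2, 1)},
    {"address": "Kazan, Bauman 3", "area": 33.0, "price": 4_500_000, "published": date(2021, 1, 10)},
]

def deduplicate(ads):
    """Keep only the earliest ad in each group of likely duplicates."""
    originals = {}
    for ad in ads:
        # Ads with the same address, area, and price are treated as duplicates.
        key = (ad["address"], ad["area"], ad["price"])
        if key not in originals or ad["published"] < originals[key]["published"]:
            originals[key] = ad
    return list(originals.values())

unique = deduplicate(ads)
```

In practice a real deduplication key would likely be fuzzier (normalized address, tolerance on price), but the keep-the-earliest logic is the same.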
# Dataset
The real estate market in Russia is of two types; in the dataset this is encoded as object type: 0 - secondary real estate market; 2 - new building.
I found it necessary to determine the geolocation for each ad's address and added the coordinates to this dataset. There is also the number of the region of Russia; for example, the number of the Chuvash region is 21. Additionally, there is a house identifier that is synchronized with "FIAS", the federal public address database of the Federal Tax Service. Since the data is obtained through a paid third-party service, I cannot publish the raw results; however, I can anonymize them and publish parameters such as the street ID and house ID.
Basically, all houses are built from materials such as brick, wood, panels, and others. I encoded them as building type: 0 - unknown; 1 - other; 2 - panel; 3 - monolithic; 4 - brick; 5 - block; 6 - wooden.
The number of rooms can be 1, 2, or more. However, there is a type of apartment called a studio; I labeled studios with "-1".
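The code tables above can be turned into small lookup mappings when working with the dataset. The mappings below are a convenience sketch derived from this card's description; the column names (`object_type`, `building_type`, `rooms`) are assumptions, not guaranteed to match the actual files.

```python
# Code tables from the dataset description.
OBJECT_TYPE = {0: "secondary market", 2: "new building"}
BUILDING_TYPE = {
    0: "unknown", 1: "other", 2: "panel",
    3: "monolithic", 4: "brick", 5: "block", 6: "wooden",
}

def decode_rooms(code: int) -> str:
    """-1 encodes a studio apartment; other values are literal room counts."""
    return "studio" if code == -1 else f"{code} room(s)"

def describe(row: dict) -> str:
    """Render one row's coded fields as a human-readable string."""
    return (f"{OBJECT_TYPE.get(row['object_type'], 'unknown')}, "
            f"{BUILDING_TYPE.get(row['building_type'], 'unknown')}, "
            f"{decode_rooms(row['rooms'])}")

example = describe({"object_type": 2, "building_type": 4, "rooms": -1})
# example is "new building, brick, studio"
```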
# Ideas
I hope that the publication of this dataset will improve developments in the field of global real estate. For example:
- You can create apartment price forecasts.
- You can analyze real estate markets.
- You can demonstrate the need for freely published real estate datasets.
- And much more.
# Others
The license for this dataset is public: you can use it in scientific research, design work, and other projects. The only condition is the publication of a link to this dataset.
You can send suggestions (or complaints) about the dataset to daniilakk@gmail.com
You can find more information about the data on the website https://dom.realtycloud.ru/
|
cwchoi/whisper_medium_ptt | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 202660216
num_examples: 211
- name: test
num_bytes: 25931848
num_examples: 27
- name: valid
num_bytes: 24971896
num_examples: 26
download_size: 35302562
dataset_size: 253563960
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
OpenHust/vietnamese-summarization | ---
task_categories:
- summarization
language:
- vi
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kaliansh/oneapi | ---
license: unknown
---
|
AhM19/chatdoctor-llama2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 191764375
num_examples: 207408
download_size: 117731869
dataset_size: 191764375
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
OmAlve/Real-VS-AI-Art | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': AI_Art
'1': Real_Art
splits:
- name: train
num_bytes: 503590732.0
num_examples: 972
download_size: 501372432
dataset_size: 503590732.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/lotte_science_dev | ---
pretty_name: '`lotte/science/dev`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `lotte/science/dev`
The `lotte/science/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/lotte#lotte/science/dev).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=343,642
This dataset is used by: [`lotte_science_dev_forum`](https://huggingface.co/datasets/irds/lotte_science_dev_forum), [`lotte_science_dev_search`](https://huggingface.co/datasets/irds/lotte_science_dev_search)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/lotte_science_dev', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Santhanam2021ColBERTv2,
title = "ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction",
author = "Keshav Santhanam and Omar Khattab and Jon Saad-Falcon and Christopher Potts and Matei Zaharia",
journal= "arXiv preprint arXiv:2112.01488",
year = "2021",
url = "https://arxiv.org/abs/2112.01488"
}
```
|
kainatq/emelly | ---
license: apache-2.0
---
|
minoruskore/wod8781nuo348jg5wf0832 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: mark
dtype: string
- name: model
dtype: string
- name: year
dtype: int64
- name: mileage
dtype: int64
- name: vol_engine
dtype: int64
- name: fuel
dtype: string
- name: price
dtype: int64
splits:
- name: train
num_bytes: 6622964
num_examples: 94585
- name: test
num_bytes: 1633943
num_examples: 23342
download_size: 2026065
dataset_size: 8256907
---
# Dataset Card for "wod8781nuo348jg5wf0832"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mehdiiraqui/twitter_disaster | ---
language:
- en
tags:
- disaster-classification
- text classification
- NLP
--- |
OdiaGenAI/all_combined_odia_171k | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-generation
language:
- or
pretty_name: all_combined_odia_171K
size_categories:
- 100K<n<1M
---
# Dataset Card for all_combined_odia_171K
## Dataset Description
- **Homepage: https://www.odiagenai.org/**
- **Repository: https://github.com/shantipriyap/OdiaGenAI**
- **Point of Contact: Shantipriya Parida, and Sambit Sekhar**
### Dataset Summary
This dataset is a mix of Odia instruction sets translated from open-source instruction sets.
The Odia instruction sets used are:
* dolly-odia-15k
* OdiEnCorp_translation_instructions_25k
* gpt-teacher-roleplay-odia-3k
* Odia_Alpaca_instructions_52k
* hardcode_odia_qa_105
In this dataset, the Odia instruction, input, and output strings are available.
### Supported Tasks and Leaderboards
Large Language Model (LLM)
### Languages
Odia
## Dataset Structure
JSON
### Data Fields
output (string)
data_source (string)
instruction (string)
input (string)
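Given the four fields listed above, a record can be formatted into a single training prompt. The layout below is one plausible Alpaca-style template, shown as a sketch; this card does not specify the actual prompt format used for training.

```python
def build_prompt(record: dict) -> str:
    """Format one record into an instruction-tuning prompt.

    This layout is an assumption for illustration; the original
    training format is not documented on the card.
    """
    if record.get("input"):
        return (f"Instruction: {record['instruction']}\n"
                f"Input: {record['input']}\n"
                f"Output: {record['output']}")
    # Many instruction sets leave `input` empty; omit the line then.
    return (f"Instruction: {record['instruction']}\n"
            f"Output: {record['output']}")

# Hypothetical record with the four documented fields.
record = {
    "instruction": "...",
    "input": "",
    "output": "...",
    "data_source": "dolly-odia-15k",
}
prompt = build_prompt(record)
```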
### Licensing Information
This work is licensed under a
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://licensebuttons.net/l/by-nc-sa/4.0/88x31.png
[cc-by-nc-sa-shield]: https://img.shields.io/badge/License-CC%20BY--NC--SA%204.0-lightgrey.svg
### Citation Information
If you find this repository useful, please consider giving 👏 and citing:
```
@misc{OdiaGenAI,
author = {Shantipriya Parida and Sambit Sekhar and Subhadarshi Panda and Soumendra Kumar Sahoo and Swateek Jena and Abhijeet Parida and Arghyadeep Sen and Satya Ranjan Dash and Deepak Kumar Pradhan},
title = {OdiaGenAI: Generative AI and LLM Initiative for the Odia Language},
year = {2023},
publisher = {Hugging Face},
journal = {Hugging Face repository},
howpublished = {\url{https://huggingface.co/OdiaGenAI}},
}
```
### Contributions
- Shantipriya Parida
- Sambit Sekhar |
Jing24/seperate_10 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 6991061
num_examples: 7503
download_size: 1295926
dataset_size: 6991061
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1713163345 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2402050
num_examples: 7031
download_size: 1383342
dataset_size: 2402050
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
J4YL19/biored_tokenized | ---
dataset_info:
features:
- name: pmid
dtype: string
- name: passage
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence: string
splits:
- name: train
num_bytes: 2259680
num_examples: 387
- name: val
num_bytes: 604670
num_examples: 98
- name: test
num_bytes: 576610
num_examples: 97
download_size: 1083246
dataset_size: 3440960
---
# Dataset Card for "biored_tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
evertonk/Everton | ---
license: openrail
---
|
Heng666/Taiwan-patent-qa | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 1316102
num_examples: 1215
download_size: 360226
dataset_size: 1316102
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc
task_categories:
- question-answering
language:
- zh
tags:
- traditional chinese
- taiwan
pretty_name: taiwan_patent_qa
size_categories:
- 1K<n<10K
---
# Taiwan Intellectual Property Office (Ministry of Economic Affairs) Q&A Dataset
We present a patent question-answer dataset suitable for QA systems. It collects open Q&A published by the Intellectual Property Office, amounting to about 1K question-answer pairs, and aims to improve how well language models handle Taiwan-specific, real-world scenarios.
<p align="center">
<img src="https://huggingface.co/datasets/Heng666/Taiwan-patent-qa/resolve/main/Image Creator.jpeg" style="max-width: 400" width=400 />
</p>
# Citation
```
@article{TaiwanPatent2024,
  title={A Patent QA for Taiwan Language Model},
author={Heng-Shiou Sheu},
journal={arXiv},
year={2024}
}
``` |
fathyshalab/reklambox-filtered | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: label_name
dtype: string
- name: label
dtype: int64
- name: __index_level_0__
dtype: int64
- name: sentence_length
dtype: int64
splits:
- name: test
num_bytes: 281204
num_examples: 350
- name: train
num_bytes: 643860
num_examples: 808
download_size: 554464
dataset_size: 925064
---
# Dataset Card for "reklambox-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_abhiramtirumala__DialoGPT-sarcastic-medium | ---
pretty_name: Evaluation run of abhiramtirumala/DialoGPT-sarcastic-medium
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abhiramtirumala/DialoGPT-sarcastic-medium](https://huggingface.co/abhiramtirumala/DialoGPT-sarcastic-medium)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhiramtirumala__DialoGPT-sarcastic-medium\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T00:36:45.634956](https://huggingface.co/datasets/open-llm-leaderboard/details_abhiramtirumala__DialoGPT-sarcastic-medium/blob/main/results_2023-09-23T00-36-45.634956.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 0.0,\n \"f1_stderr\": 0.0,\n \"\
acc\": 0.26677190213101815,\n \"acc_stderr\": 0.007010413338799049\n },\n\
\ \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n\
\ \"f1\": 0.0,\n \"f1_stderr\": 0.0\n },\n \"harness|gsm8k|5\"\
: {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5335438042620363,\n \"acc_stderr\": 0.014020826677598098\n\
\ }\n}\n```"
repo_url: https://huggingface.co/abhiramtirumala/DialoGPT-sarcastic-medium
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|arc:challenge|25_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T18_48_39.393988
path:
- '**/details_harness|drop|3_2023-09-17T18-48-39.393988.parquet'
- split: 2023_09_23T00_36_45.634956
path:
- '**/details_harness|drop|3_2023-09-23T00-36-45.634956.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T00-36-45.634956.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T18_48_39.393988
path:
- '**/details_harness|gsm8k|5_2023-09-17T18-48-39.393988.parquet'
- split: 2023_09_23T00_36_45.634956
path:
- '**/details_harness|gsm8k|5_2023-09-23T00-36-45.634956.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T00-36-45.634956.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hellaswag|10_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:39:52.332273.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T10:39:52.332273.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T10:39:52.332273.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T18_48_39.393988
path:
- '**/details_harness|winogrande|5_2023-09-17T18-48-39.393988.parquet'
- split: 2023_09_23T00_36_45.634956
path:
- '**/details_harness|winogrande|5_2023-09-23T00-36-45.634956.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T00-36-45.634956.parquet'
- config_name: results
data_files:
- split: 2023_07_19T10_39_52.332273
path:
- results_2023-07-19T10:39:52.332273.parquet
- split: 2023_09_17T18_48_39.393988
path:
- results_2023-09-17T18-48-39.393988.parquet
- split: 2023_09_23T00_36_45.634956
path:
- results_2023-09-23T00-36-45.634956.parquet
- split: latest
path:
- results_2023-09-23T00-36-45.634956.parquet
---
# Dataset Card for Evaluation run of abhiramtirumala/DialoGPT-sarcastic-medium
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/abhiramtirumala/DialoGPT-sarcastic-medium
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [abhiramtirumala/DialoGPT-sarcastic-medium](https://huggingface.co/abhiramtirumala/DialoGPT-sarcastic-medium) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhiramtirumala__DialoGPT-sarcastic-medium",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-23T00:36:45.634956](https://huggingface.co/datasets/open-llm-leaderboard/details_abhiramtirumala__DialoGPT-sarcastic-medium/blob/main/results_2023-09-23T00-36-45.634956.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0,
"f1_stderr": 0.0,
"acc": 0.26677190213101815,
"acc_stderr": 0.007010413338799049
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0,
"f1_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5335438042620363,
"acc_stderr": 0.014020826677598098
}
}
```
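For quick inspection, the nested results dictionary above can be flattened into `task/metric` keys with plain Python. This is a sketch using only the values printed above; the variable names are illustrative, not part of the dataset's API:

```python
# Nested results, copied from the latest run shown above.
results = {
    "all": {
        "em": 0.0, "em_stderr": 0.0,
        "f1": 0.0, "f1_stderr": 0.0,
        "acc": 0.26677190213101815, "acc_stderr": 0.007010413338799049,
    },
    "harness|drop|3": {"em": 0.0, "em_stderr": 0.0, "f1": 0.0, "f1_stderr": 0.0},
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
    "harness|winogrande|5": {
        "acc": 0.5335438042620363, "acc_stderr": 0.014020826677598098,
    },
}

# Flatten into "task/metric" -> value pairs for easy lookup or tabulation.
flat = {
    f"{task}/{metric}": value
    for task, metrics in results.items()
    for metric, value in metrics.items()
}

print(flat["harness|winogrande|5/acc"])  # 0.5335438042620363
```

The same pattern applies to any per-run JSON file downloaded from the repository.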
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Atipico1/nq-test-replace-format | ---
dataset_info:
features:
- name: question
dtype: string
- name: entity
dtype: string
- name: similar_entity
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: masked_query
dtype: string
- name: original_case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: unans_case
list:
- name: answer
dtype: string
- name: answers
sequence: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: conflict_case
list:
- name: answer
dtype: string
- name: conflict_context
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: context
dtype: string
- name: context_vague
dtype: string
- name: entities
dtype: string
- name: entities_count
dtype: int64
- name: adv_sent
dtype: string
- name: adv_passage
dtype: string
- name: cos_sim
dtype: float64
- name: answer_match
dtype: bool
- name: is_valid_adversary
dtype: bool
- name: hasanswer
dtype: bool
- name: is_adversarial
dtype: bool
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 70347010
num_examples: 3610
download_size: 41260388
dataset_size: 70347010
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_adamo1139__Yi-34B-AEZAKMI-v1 | ---
pretty_name: Evaluation run of adamo1139/Yi-34B-AEZAKMI-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [adamo1139/Yi-34B-AEZAKMI-v1](https://huggingface.co/adamo1139/Yi-34B-AEZAKMI-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adamo1139__Yi-34B-AEZAKMI-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T22:17:18.926595](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-34B-AEZAKMI-v1/blob/main/results_2023-12-04T22-17-18.926595.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.733063777345197,\n\
\ \"acc_stderr\": 0.02911576095445218,\n \"acc_norm\": 0.7392718490739228,\n\
\ \"acc_norm_stderr\": 0.029657906091365063,\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.01716027390169365,\n \"mc2\": 0.557340774150812,\n\
\ \"mc2_stderr\": 0.015053849366752348\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.606655290102389,\n \"acc_stderr\": 0.014275101465693024,\n\
\ \"acc_norm\": 0.643344709897611,\n \"acc_norm_stderr\": 0.01399805690262019\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6422027484564827,\n\
\ \"acc_stderr\": 0.004783723798286501,\n \"acc_norm\": 0.8430591515634336,\n\
\ \"acc_norm_stderr\": 0.0036300159898964017\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n\
\ \"acc_stderr\": 0.03999262876617721,\n \"acc_norm\": 0.6888888888888889,\n\
\ \"acc_norm_stderr\": 0.03999262876617721\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.868421052631579,\n \"acc_stderr\": 0.027508689533549912,\n\
\ \"acc_norm\": 0.868421052631579,\n \"acc_norm_stderr\": 0.027508689533549912\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7735849056603774,\n \"acc_stderr\": 0.025757559893106737,\n\
\ \"acc_norm\": 0.7735849056603774,\n \"acc_norm_stderr\": 0.025757559893106737\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n\
\ \"acc_stderr\": 0.030635578972093278,\n \"acc_norm\": 0.8402777777777778,\n\
\ \"acc_norm_stderr\": 0.030635578972093278\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n\
\ \"acc_stderr\": 0.03476599607516478,\n \"acc_norm\": 0.7052023121387283,\n\
\ \"acc_norm_stderr\": 0.03476599607516478\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7404255319148936,\n \"acc_stderr\": 0.028659179374292326,\n\
\ \"acc_norm\": 0.7404255319148936,\n \"acc_norm_stderr\": 0.028659179374292326\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.03600105692727771,\n\
\ \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.03600105692727771\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6507936507936508,\n \"acc_stderr\": 0.024552292209342658,\n \"\
acc_norm\": 0.6507936507936508,\n \"acc_norm_stderr\": 0.024552292209342658\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.896774193548387,\n\
\ \"acc_stderr\": 0.01730838128103453,\n \"acc_norm\": 0.896774193548387,\n\
\ \"acc_norm_stderr\": 0.01730838128103453\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5960591133004927,\n \"acc_stderr\": 0.03452453903822032,\n\
\ \"acc_norm\": 0.5960591133004927,\n \"acc_norm_stderr\": 0.03452453903822032\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781657,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781657\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853113,\n \"\
acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853113\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.013492659751295136,\n\
\ \"acc_norm\": 0.9637305699481865,\n \"acc_norm_stderr\": 0.013492659751295136\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7769230769230769,\n \"acc_stderr\": 0.02110773012724401,\n \
\ \"acc_norm\": 0.7769230769230769,\n \"acc_norm_stderr\": 0.02110773012724401\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.40370370370370373,\n \"acc_stderr\": 0.029914812342227638,\n \
\ \"acc_norm\": 0.40370370370370373,\n \"acc_norm_stderr\": 0.029914812342227638\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8277310924369747,\n \"acc_stderr\": 0.024528664971305424,\n\
\ \"acc_norm\": 0.8277310924369747,\n \"acc_norm_stderr\": 0.024528664971305424\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.423841059602649,\n \"acc_stderr\": 0.04034846678603396,\n \"acc_norm\"\
: 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603396\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9119266055045872,\n\
\ \"acc_stderr\": 0.012150743719481693,\n \"acc_norm\": 0.9119266055045872,\n\
\ \"acc_norm_stderr\": 0.012150743719481693\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.6342592592592593,\n \"acc_stderr\": 0.032847388576472056,\n\
\ \"acc_norm\": 0.6342592592592593,\n \"acc_norm_stderr\": 0.032847388576472056\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9019607843137255,\n \"acc_stderr\": 0.0208711184555521,\n \"acc_norm\"\
: 0.9019607843137255,\n \"acc_norm_stderr\": 0.0208711184555521\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.890295358649789,\n \"acc_stderr\": 0.020343400734868837,\n \"\
acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.020343400734868837\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445795,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445795\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.03343270062869621,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.03343270062869621\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8466257668711656,\n \"acc_stderr\": 0.0283116014414386,\n\
\ \"acc_norm\": 0.8466257668711656,\n \"acc_norm_stderr\": 0.0283116014414386\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6517857142857143,\n\
\ \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.6517857142857143,\n\
\ \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.9029126213592233,\n \"acc_stderr\": 0.02931596291881347,\n\
\ \"acc_norm\": 0.9029126213592233,\n \"acc_norm_stderr\": 0.02931596291881347\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n\
\ \"acc_stderr\": 0.01789378490401854,\n \"acc_norm\": 0.9188034188034188,\n\
\ \"acc_norm_stderr\": 0.01789378490401854\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.896551724137931,\n\
\ \"acc_stderr\": 0.010890452544691499,\n \"acc_norm\": 0.896551724137931,\n\
\ \"acc_norm_stderr\": 0.010890452544691499\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8063583815028902,\n \"acc_stderr\": 0.021274230317515557,\n\
\ \"acc_norm\": 0.8063583815028902,\n \"acc_norm_stderr\": 0.021274230317515557\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7027932960893855,\n\
\ \"acc_stderr\": 0.0152853133536416,\n \"acc_norm\": 0.7027932960893855,\n\
\ \"acc_norm_stderr\": 0.0152853133536416\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.022140767512880945,\n\
\ \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.022140767512880945\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.02118589361522516,\n\
\ \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.02118589361522516\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5921985815602837,\n \"acc_stderr\": 0.029316011776343555,\n \
\ \"acc_norm\": 0.5921985815602837,\n \"acc_norm_stderr\": 0.029316011776343555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5814863102998696,\n\
\ \"acc_stderr\": 0.012599505608336477,\n \"acc_norm\": 0.5814863102998696,\n\
\ \"acc_norm_stderr\": 0.012599505608336477\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n\
\ \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7810457516339869,\n \"acc_stderr\": 0.016729937565537558,\n \
\ \"acc_norm\": 0.7810457516339869,\n \"acc_norm_stderr\": 0.016729937565537558\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8408163265306122,\n \"acc_stderr\": 0.023420972069166344,\n\
\ \"acc_norm\": 0.8408163265306122,\n \"acc_norm_stderr\": 0.023420972069166344\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.02116621630465939,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.02116621630465939\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.9005847953216374,\n \"acc_stderr\": 0.022949025579355024,\n\
\ \"acc_norm\": 0.9005847953216374,\n \"acc_norm_stderr\": 0.022949025579355024\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.01716027390169365,\n \"mc2\": 0.557340774150812,\n\
\ \"mc2_stderr\": 0.015053849366752348\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5291887793783169,\n \
\ \"acc_stderr\": 0.013748996794921798\n }\n}\n```"
repo_url: https://huggingface.co/adamo1139/Yi-34B-AEZAKMI-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|arc:challenge|25_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|gsm8k|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hellaswag|10_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T22-17-18.926595.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T22-17-18.926595.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- '**/details_harness|winogrande|5_2023-12-04T22-17-18.926595.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T22-17-18.926595.parquet'
- config_name: results
data_files:
- split: 2023_12_04T22_17_18.926595
path:
- results_2023-12-04T22-17-18.926595.parquet
- split: latest
path:
- results_2023-12-04T22-17-18.926595.parquet
---
# Dataset Card for Evaluation run of adamo1139/Yi-34B-AEZAKMI-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/adamo1139/Yi-34B-AEZAKMI-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [adamo1139/Yi-34B-AEZAKMI-v1](https://huggingface.co/adamo1139/Yi-34B-AEZAKMI-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adamo1139__Yi-34B-AEZAKMI-v1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-04T22:17:18.926595](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-34B-AEZAKMI-v1/blob/main/results_2023-12-04T22-17-18.926595.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task's results in its "latest" split and in the aggregated "results" configuration):
```python
{
"all": {
"acc": 0.733063777345197,
"acc_stderr": 0.02911576095445218,
"acc_norm": 0.7392718490739228,
"acc_norm_stderr": 0.029657906091365063,
"mc1": 0.401468788249694,
"mc1_stderr": 0.01716027390169365,
"mc2": 0.557340774150812,
"mc2_stderr": 0.015053849366752348
},
"harness|arc:challenge|25": {
"acc": 0.606655290102389,
"acc_stderr": 0.014275101465693024,
"acc_norm": 0.643344709897611,
"acc_norm_stderr": 0.01399805690262019
},
"harness|hellaswag|10": {
"acc": 0.6422027484564827,
"acc_stderr": 0.004783723798286501,
"acc_norm": 0.8430591515634336,
"acc_norm_stderr": 0.0036300159898964017
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.03999262876617721,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.03999262876617721
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.868421052631579,
"acc_stderr": 0.027508689533549912,
"acc_norm": 0.868421052631579,
"acc_norm_stderr": 0.027508689533549912
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7735849056603774,
"acc_stderr": 0.025757559893106737,
"acc_norm": 0.7735849056603774,
"acc_norm_stderr": 0.025757559893106737
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.030635578972093278,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.030635578972093278
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.03476599607516478,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.03476599607516478
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7404255319148936,
"acc_stderr": 0.028659179374292326,
"acc_norm": 0.7404255319148936,
"acc_norm_stderr": 0.028659179374292326
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6507936507936508,
"acc_stderr": 0.024552292209342658,
"acc_norm": 0.6507936507936508,
"acc_norm_stderr": 0.024552292209342658
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.01730838128103453,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.01730838128103453
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5960591133004927,
"acc_stderr": 0.03452453903822032,
"acc_norm": 0.5960591133004927,
"acc_norm_stderr": 0.03452453903822032
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781657,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781657
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853113,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853113
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.013492659751295136,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.013492659751295136
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7769230769230769,
"acc_stderr": 0.02110773012724401,
"acc_norm": 0.7769230769230769,
"acc_norm_stderr": 0.02110773012724401
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.40370370370370373,
"acc_stderr": 0.029914812342227638,
"acc_norm": 0.40370370370370373,
"acc_norm_stderr": 0.029914812342227638
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8277310924369747,
"acc_stderr": 0.024528664971305424,
"acc_norm": 0.8277310924369747,
"acc_norm_stderr": 0.024528664971305424
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.04034846678603396,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.04034846678603396
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9119266055045872,
"acc_stderr": 0.012150743719481693,
"acc_norm": 0.9119266055045872,
"acc_norm_stderr": 0.012150743719481693
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6342592592592593,
"acc_stderr": 0.032847388576472056,
"acc_norm": 0.6342592592592593,
"acc_norm_stderr": 0.032847388576472056
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.0208711184555521,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.0208711184555521
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.020343400734868837,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.020343400734868837
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.030922788320445795,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.030922788320445795
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.03343270062869621,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.03343270062869621
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8466257668711656,
"acc_stderr": 0.0283116014414386,
"acc_norm": 0.8466257668711656,
"acc_norm_stderr": 0.0283116014414386
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6517857142857143,
"acc_stderr": 0.045218299028335865,
"acc_norm": 0.6517857142857143,
"acc_norm_stderr": 0.045218299028335865
},
"harness|hendrycksTest-management|5": {
"acc": 0.9029126213592233,
"acc_stderr": 0.02931596291881347,
"acc_norm": 0.9029126213592233,
"acc_norm_stderr": 0.02931596291881347
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.01789378490401854,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.01789378490401854
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.896551724137931,
"acc_stderr": 0.010890452544691499,
"acc_norm": 0.896551724137931,
"acc_norm_stderr": 0.010890452544691499
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8063583815028902,
"acc_stderr": 0.021274230317515557,
"acc_norm": 0.8063583815028902,
"acc_norm_stderr": 0.021274230317515557
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7027932960893855,
"acc_stderr": 0.0152853133536416,
"acc_norm": 0.7027932960893855,
"acc_norm_stderr": 0.0152853133536416
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.022140767512880945,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.022140767512880945
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.02118589361522516,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.02118589361522516
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5921985815602837,
"acc_stderr": 0.029316011776343555,
"acc_norm": 0.5921985815602837,
"acc_norm_stderr": 0.029316011776343555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5814863102998696,
"acc_stderr": 0.012599505608336477,
"acc_norm": 0.5814863102998696,
"acc_norm_stderr": 0.012599505608336477
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7536764705882353,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.7536764705882353,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7810457516339869,
"acc_stderr": 0.016729937565537558,
"acc_norm": 0.7810457516339869,
"acc_norm_stderr": 0.016729937565537558
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8408163265306122,
"acc_stderr": 0.023420972069166344,
"acc_norm": 0.8408163265306122,
"acc_norm_stderr": 0.023420972069166344
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.02116621630465939,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.02116621630465939
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9005847953216374,
"acc_stderr": 0.022949025579355024,
"acc_norm": 0.9005847953216374,
"acc_norm_stderr": 0.022949025579355024
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.01716027390169365,
"mc2": 0.557340774150812,
"mc2_stderr": 0.015053849366752348
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
},
"harness|gsm8k|5": {
"acc": 0.5291887793783169,
"acc_stderr": 0.013748996794921798
}
}
```
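The per-task results above can be aggregated programmatically. As a minimal sketch (using a small excerpt of the dict rather than the full results file), the MMLU sub-task accuracies can be averaged by filtering on the `harness|hendrycksTest-` key prefix:

```python
# Small excerpt of the results dict above, for illustration only.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6888888888888889},
    "harness|truthfulqa:mc|0": {"mc1": 0.401468788249694},
}

# Keep only the MMLU (hendrycksTest) sub-tasks and average their "acc" scores.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))  # → 0.5144
```

The same pattern applies to the full dict loaded from the results JSON linked above.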
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]