datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
vitaliy-sharandin/climate-world-region | ---
dataset_info:
features:
- name: Entity
dtype: string
- name: Seasonal variation
dtype: float64
- name: Combined measurements
dtype: float64
- name: Monthly averaged
dtype: float64
- name: Annual averaged
dtype: float64
- name: monthly_sea_surface_temperature_anomaly
dtype: float64
- name: Sea surface temp (lower-bound)
dtype: float64
- name: Sea surface temp (upper-bound)
dtype: float64
- name: Monthly pH measurement
dtype: float64
- name: Annual average
dtype: float64
- name: Temperature anomaly
dtype: float64
- name: Church & White
dtype: float64
- name: University of Hawaii
dtype: float64
- name: Average
dtype: float64
- name: arctic_sea_ice_osisaf
dtype: float64
- name: Monthly averaged.1
dtype: float64
- name: Annual averaged.1
dtype: float64
- name: Monthly averaged.2
dtype: float64
- name: Annual averaged.2
dtype: float64
- name: Date
dtype: timestamp[ns, tz=UTC]
- name: dt
dtype: timestamp[ns, tz=UTC]
splits:
- name: train
num_bytes: 1813733
num_examples: 10198
download_size: 450942
dataset_size: 1813733
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "climate-world-region"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andrewatef/llama2-llm | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 148923611.0
num_examples: 516177
download_size: 81796375
dataset_size: 148923611.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama2-llm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
freshpearYoon/train_free_5 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604800584
num_examples: 10000
download_size: 1456470291
dataset_size: 9604800584
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kinnews_kirnews | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- rn
- rw
license:
- mit
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
- topic-classification
paperswithcode_id: kinnews-and-kirnews
pretty_name: KinnewsKirnews
dataset_info:
- config_name: kinnews_raw
features:
- name: label
dtype:
class_label:
names:
'0': politics
'1': sport
'2': economy
'3': health
'4': entertainment
'5': history
'6': technology
'7': tourism
'8': culture
'9': fashion
'10': religion
'11': environment
'12': education
'13': relationship
- name: kin_label
dtype: string
- name: en_label
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 38316546
num_examples: 17014
- name: test
num_bytes: 11971938
num_examples: 4254
download_size: 27377755
dataset_size: 50288484
- config_name: kinnews_cleaned
features:
- name: label
dtype:
class_label:
names:
'0': politics
'1': sport
'2': economy
'3': health
'4': entertainment
'5': history
'6': technology
'7': tourism
'8': culture
'9': fashion
'10': religion
'11': environment
'12': education
'13': relationship
- name: title
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 32780382
num_examples: 17014
- name: test
num_bytes: 8217453
num_examples: 4254
download_size: 27377755
dataset_size: 40997835
- config_name: kirnews_raw
features:
- name: label
dtype:
class_label:
names:
'0': politics
'1': sport
'2': economy
'3': health
'4': entertainment
'5': history
'6': technology
'7': tourism
'8': culture
'9': fashion
'10': religion
'11': environment
'12': education
'13': relationship
- name: kir_label
dtype: string
- name: en_label
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 7343223
num_examples: 3689
- name: test
num_bytes: 2499189
num_examples: 923
download_size: 5186111
dataset_size: 9842412
- config_name: kirnews_cleaned
features:
- name: label
dtype:
class_label:
names:
'0': politics
'1': sport
'2': economy
'3': health
'4': entertainment
'5': history
'6': technology
'7': tourism
'8': culture
'9': fashion
'10': religion
'11': environment
'12': education
'13': relationship
- name: title
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 6629767
num_examples: 3689
- name: test
num_bytes: 1570745
num_examples: 923
download_size: 5186111
dataset_size: 8200512
config_names:
- kinnews_cleaned
- kinnews_raw
- kirnews_cleaned
- kirnews_raw
---
# Dataset Card for kinnews_kirnews
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [More Information Needed]
- **Repository:** https://github.com/Andrews2017/KINNEWS-and-KIRNEWS-Corpus
- **Paper:** [KINNEWS and KIRNEWS: Benchmarking Cross-Lingual Text Classification for Kinyarwanda and Kirundi](https://arxiv.org/abs/2010.12174)
- **Leaderboard:** NA
- **Point of Contact:** [Rubungo Andre Niyongabo](mailto:niyongabor.andre@std.uestc.edu.cn)
### Dataset Summary
Kinyarwanda and Kirundi news classification datasets (KINNEWS and KIRNEWS, respectively), both collected from Rwanda and Burundi news websites and newspapers, for low-resource monolingual and cross-lingual multiclass classification tasks.
### Supported Tasks and Leaderboards
This dataset can be used for text classification of news articles in the Kinyarwanda and Kirundi languages. Each news article is labeled with one of 14 possible classes:
- politics
- sport
- economy
- health
- entertainment
- history
- technology
- tourism
- culture
- fashion
- religion
- environment
- education
- relationship
### Languages
Kinyarwanda (`rw`) and Kirundi (`rn`)
## Dataset Structure
### Data Instances
Here is an example from the dataset:
| Field | Value |
| ----- | ----------- |
| label | 1 |
| kin_label/kir_label | 'inkino' |
| url | 'https://nawe.bi/Primus-Ligue-Imirwi-igiye-guhura-gute-ku-ndwi-ya-6-y-ihiganwa.html' |
| title | 'Primus Ligue : Imirwi igiye guhura gute ku ndwi ya 6 y’ihiganwa ?' |
| content | ' Inkino zitegekanijwe kuruno wa gatandatu igenekerezo rya 14 Nyakanga umwaka wa 2019...'|
| en_label| 'sport'|
### Data Fields
The raw version of the data for the Kinyarwanda language consists of the following fields:
- label: The category of the news article
- kin_label/kir_label: The associated label in Kinyarwanda/Kirundi language
- en_label: The associated label in English
- url: The URL of the news article
- title: The title of the news article
- content: The content of the news article
The cleaned version contains only the `label`, `title`, and `content` fields.
### Data Splits
| Lang | Train | Test |
|---|---|---|
| Kinyarwanda Raw | 17014 | 4254 |
| Kinyarwanda Cleaned | 17014 | 4254 |
| Kirundi Raw | 3689 | 923 |
| Kirundi Cleaned | 3689 | 923 |
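The integer `label` field uses the same 14-way id-to-name scheme across all four configs. A minimal sketch of the mapping, mirroring the `class_label` names declared in the metadata above (the `id2label` helper is hypothetical, for illustration only, and not part of the dataset loader):

```python
# Label ids shared by the KINNEWS and KIRNEWS configs, copied from the
# `class_label` names in the dataset metadata.
LABEL_NAMES = [
    "politics", "sport", "economy", "health", "entertainment",
    "history", "technology", "tourism", "culture", "fashion",
    "religion", "environment", "education", "relationship",
]

def id2label(label_id: int) -> str:
    """Map an integer `label` field to its English class name."""
    return LABEL_NAMES[label_id]

# The example row above carries label 1 and en_label 'sport':
print(id2label(1))  # sport
```

The `en_label` column in the raw configs stores the same name redundantly, so a mapping like this is mainly useful with the cleaned configs, which drop that column.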
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@article{niyongabo2020kinnews,
title={KINNEWS and KIRNEWS: Benchmarking Cross-Lingual Text Classification for Kinyarwanda and Kirundi},
author={Niyongabo, Rubungo Andre and Qu, Hong and Kreutzer, Julia and Huang, Li},
journal={arXiv preprint arXiv:2010.12174},
year={2020}
}
```
### Contributions
Thanks to [@saradhix](https://github.com/saradhix) for adding this dataset. |
AdapterOcean/med_alpaca_standardized_cluster_21 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 97464141
num_examples: 10378
download_size: 27330578
dataset_size: 97464141
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_21"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nekofura/AK-Dataset | ---
license: openrail
---
|
zolak/twitter_dataset_81_1713137780 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 298113
num_examples: 745
download_size: 154144
dataset_size: 298113
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
adriana98/medical_spanish_pytorch | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 25360427.0
num_examples: 167
- name: test
num_bytes: 6715699.0
num_examples: 42
download_size: 31836958
dataset_size: 32076126.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-staging-eval-project-ml6team__cnn_dailymail_nl-612d6c13-12185622 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- ml6team/cnn_dailymail_nl
eval_info:
task: summarization
model: yhavinga/mt5-base-cnn-nl
metrics: []
dataset_name: ml6team/cnn_dailymail_nl
dataset_config: default
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: yhavinga/mt5-base-cnn-nl
* Dataset: ml6team/cnn_dailymail_nl
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@yhavinga](https://huggingface.co/yhavinga) for evaluating this model. |
Dippi9845/xsum-with_fragments | ---
license: apache-2.0
---
|
mii-llm/code-ita-dpo-small | ---
dataset_info:
features:
- name: input
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 2280369
num_examples: 609
download_size: 1090146
dataset_size: 2280369
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code-instructions-ita-dpo-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-futin__guess-vi-f50546-2087567167 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/guess
eval_info:
task: text_zero_shot_classification
model: bigscience/bloomz-1b7
metrics: []
dataset_name: futin/guess
dataset_config: vi
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloomz-1b7
* Dataset: futin/guess
* Config: vi
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
samkenxstream/turnkey-triumph-326606_SamKenX-imdb | ---
license: bsl-1.0
task_categories:
- token-classification
- text-classification
language:
- aa
- an
- av
size_categories:
- 100K<n<1M
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-OpenOrca_5w | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-OpenOrca_5w
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-OpenOrca_5w](https://huggingface.co/CHIH-HUNG/llama-2-13b-OpenOrca_5w)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-OpenOrca_5w\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T01:39:25.710626](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-OpenOrca_5w/blob/main/results_2023-10-15T01-39-25.710626.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.016988255033557047,\n\
\ \"em_stderr\": 0.0013234068882109725,\n \"f1\": 0.07921036073825467,\n\
\ \"f1_stderr\": 0.0018349450446225916,\n \"acc\": 0.4501236556598269,\n\
\ \"acc_stderr\": 0.010366021206293671\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.016988255033557047,\n \"em_stderr\": 0.0013234068882109725,\n\
\ \"f1\": 0.07921036073825467,\n \"f1_stderr\": 0.0018349450446225916\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12282031842304776,\n \
\ \"acc_stderr\": 0.009041108602874676\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712666\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-OpenOrca_5w
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T01_39_25.710626
path:
- '**/details_harness|drop|3_2023-10-15T01-39-25.710626.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T01-39-25.710626.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T01_39_25.710626
path:
- '**/details_harness|gsm8k|5_2023-10-15T01-39-25.710626.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T01-39-25.710626.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T01_39_25.710626
path:
- '**/details_harness|winogrande|5_2023-10-15T01-39-25.710626.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T01-39-25.710626.parquet'
- config_name: results
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- results_2023-08-29T20:46:12.549567.parquet
- split: 2023_10_15T01_39_25.710626
path:
- results_2023-10-15T01-39-25.710626.parquet
- split: latest
path:
- results_2023-10-15T01-39-25.710626.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-OpenOrca_5w
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-OpenOrca_5w
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-OpenOrca_5w](https://huggingface.co/CHIH-HUNG/llama-2-13b-OpenOrca_5w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-OpenOrca_5w",
"harness_winogrande_5",
	split="latest")
```
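Because the timestamped split names use zero-padded fields, lexicographic order matches chronological order, so the most recent run can be picked without any date parsing. A small sketch using the two runs present in this repository:

```python
def latest_run(split_names):
    """Return the most recent timestamped split name.

    Split names such as '2023_08_29T20_46_12.549567' use zero-padded
    fields, so lexicographic order equals chronological order.
    """
    return max(s for s in split_names if s != "latest")

splits = ["latest", "2023_08_29T20_46_12.549567", "2023_10_15T01_39_25.710626"]
print(latest_run(splits))  # 2023_10_15T01_39_25.710626
```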
## Latest results
These are the [latest results from run 2023-10-15T01:39:25.710626](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-OpenOrca_5w/blob/main/results_2023-10-15T01-39-25.710626.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.016988255033557047,
"em_stderr": 0.0013234068882109725,
"f1": 0.07921036073825467,
"f1_stderr": 0.0018349450446225916,
"acc": 0.4501236556598269,
"acc_stderr": 0.010366021206293671
},
"harness|drop|3": {
"em": 0.016988255033557047,
"em_stderr": 0.0013234068882109725,
"f1": 0.07921036073825467,
"f1_stderr": 0.0018349450446225916
},
"harness|gsm8k|5": {
"acc": 0.12282031842304776,
"acc_stderr": 0.009041108602874676
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.011690933809712666
}
}
```
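The top-level "all" block appears to be an unweighted mean over the tasks that report each metric; for example, its acc value can be recomputed from the per-task entries above:

```python
# Per-task accuracies copied from the results block above.
task_acc = {
    "harness|gsm8k|5": 0.12282031842304776,
    "harness|winogrande|5": 0.7774269928966061,
}

# "all" averages acc over the tasks that report it (drop only reports em/f1).
mean_acc = sum(task_acc.values()) / len(task_acc)
print(mean_acc)  # matches the reported 0.4501236556598269
```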
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
allandclive/LugandaSoloSpeech_1K | ---
task_categories:
- automatic-speech-recognition
language:
- lg
size_categories:
- 100K<n<1M
---
# LugandaSoloSpeech1K
1,000+ hours of unlabeled, single-speaker Luganda speech, suitable for speech-to-text / ASR.
Audio quality varies from clean to noisy, and some recordings contain background music.
## Dataset Details
Format: MP3, mono, 64 kbps, 16 kHz
Size: 42 GB
### Data Sources
Radio shows, YouTube
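The stated size is consistent with the advertised duration; a quick back-of-envelope check from the format line above (assuming 42 GB ≈ 42×10⁹ bytes and a constant 64 kbps bitrate):

```python
# Total audio duration implied by 42 GB of constant-bitrate 64 kbps MP3.
size_bits = 42e9 * 8      # 42 GB expressed in bits
bitrate_bps = 64_000      # 64 kbps
hours = size_bits / bitrate_bps / 3600
print(round(hours))       # roughly 1458 hours, consistent with "1,000+ hours"
```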
|
autoevaluate/autoeval-eval-phpthinh__exampleem-filter-918293-1728760346 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- phpthinh/exampleem
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-1b1
metrics: []
dataset_name: phpthinh/exampleem
dataset_config: filter
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-1b1
* Dataset: phpthinh/exampleem
* Config: filter
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@phpthinh](https://huggingface.co/phpthinh) for evaluating this model. |
open-llm-leaderboard/details_roneneldan__TinyStories-33M | ---
pretty_name: Evaluation run of roneneldan/TinyStories-33M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [roneneldan/TinyStories-33M](https://huggingface.co/roneneldan/TinyStories-33M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_roneneldan__TinyStories-33M\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T05:35:11.802678](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-33M/blob/main/results_2023-09-23T05-35-11.802678.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\"\
\ split for each eval):\n\n```python\n{\n    \"all\": {\n        \"em\": 0.0003145973154362416,\n\
\ \"em_stderr\": 0.0001816137946884096,\n \"f1\": 0.001937919463087248,\n\
\ \"f1_stderr\": 0.0003031702602652814,\n \"acc\": 0.24546172059984214,\n\
\ \"acc_stderr\": 0.007025085047248846\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0003145973154362416,\n \"em_stderr\": 0.0001816137946884096,\n\
\ \"f1\": 0.001937919463087248,\n \"f1_stderr\": 0.0003031702602652814\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4909234411996843,\n\
\ \"acc_stderr\": 0.014050170094497692\n }\n}\n```"
repo_url: https://huggingface.co/roneneldan/TinyStories-33M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T05_35_11.802678
path:
- '**/details_harness|drop|3_2023-09-23T05-35-11.802678.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T05-35-11.802678.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T05_35_11.802678
path:
- '**/details_harness|gsm8k|5_2023-09-23T05-35-11.802678.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T05-35-11.802678.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:19.766363.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:32:19.766363.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:32:19.766363.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T05_35_11.802678
path:
- '**/details_harness|winogrande|5_2023-09-23T05-35-11.802678.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T05-35-11.802678.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_32_19.766363
path:
- results_2023-07-19T13:32:19.766363.parquet
- split: 2023_09_23T05_35_11.802678
path:
- results_2023-09-23T05-35-11.802678.parquet
- split: latest
path:
- results_2023-09-23T05-35-11.802678.parquet
---
# Dataset Card for Evaluation run of roneneldan/TinyStories-33M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/roneneldan/TinyStories-33M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [roneneldan/TinyStories-33M](https://huggingface.co/roneneldan/TinyStories-33M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_roneneldan__TinyStories-33M",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-23T05:35:11.802678](https://huggingface.co/datasets/open-llm-leaderboard/details_roneneldan__TinyStories-33M/blob/main/results_2023-09-23T05-35-11.802678.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0003145973154362416,
"em_stderr": 0.0001816137946884096,
"f1": 0.001937919463087248,
"f1_stderr": 0.0003031702602652814,
"acc": 0.24546172059984214,
"acc_stderr": 0.007025085047248846
},
"harness|drop|3": {
"em": 0.0003145973154362416,
"em_stderr": 0.0001816137946884096,
"f1": 0.001937919463087248,
"f1_stderr": 0.0003031702602652814
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.4909234411996843,
"acc_stderr": 0.014050170094497692
}
}
```
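For reference, the aggregate `"acc"` in the `"all"` block above is consistent with an unweighted mean of the per-task accuracies. This is a sketch of that relationship, not necessarily the leaderboard's exact aggregation code:

```python
# Per-task accuracies from the results above (em/f1 from DROP are aggregated separately).
results = {
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.4909234411996843},
}

# Unweighted mean over tasks that report "acc".
accs = [task["acc"] for task in results.values()]
overall_acc = sum(accs) / len(accs)

print(overall_acc)  # ~0.24546, matching the "all" block
```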
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
casey-martin/protocols_io | ---
license: apache-2.0
---
|
HuggingFaceM4/DocumentVQA | ---
dataset_info:
features:
- name: questionId
dtype: int32
- name: question
dtype: string
- name: question_types
list: string
- name: image
dtype: image
- name: docId
dtype: int32
- name: ucsf_document_id
dtype: string
- name: ucsf_document_page_no
dtype: string
- name: answers
list: string
splits:
- name: train
num_bytes: 5658303093.631
num_examples: 39463
- name: validation
num_bytes: 2532362556.066
num_examples: 5349
- name: test
num_bytes: 2500321215.732
num_examples: 5188
download_size: 9591606021
dataset_size: 10690986865.428999
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
maghwa/10k_prompts_ranked_arabic | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: quality
list:
- name: status
dtype: string
- name: user_id
dtype: string
- name: value
dtype: string
- name: metadata
dtype: string
- name: avg_rating
dtype: float64
- name: num_responses
dtype: int64
- name: agreement_ratio
dtype: float64
- name: raw_responses
sequence: int64
- name: kind
dtype: string
splits:
- name: train
num_bytes: 10601581
num_examples: 10331
download_size: 4323538
dataset_size: 10601581
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mcanoglu/defect-detection | ---
license: mit
---
A dataset containing safe and vulnerable code for fine-tuning an LLM for defect detection.
The data is extracted from the wonderful work in the [CVEFixes](https://github.com/secureIT-project/CVEfixes) repository.
Citation:
```
@inproceedings{bhandari2021:cvefixes,
title = {{CVEfixes: Automated Collection of Vulnerabilities and Their Fixes from Open-Source Software}},
booktitle = {{Proceedings of the 17th International Conference on Predictive Models and Data Analytics in Software Engineering (PROMISE '21)}},
author = {Bhandari, Guru and Naseer, Amara and Moonen, Leon},
year = {2021},
pages = {10},
publisher = {{ACM}},
doi = {10.1145/3475960.3475985},
copyright = {Open Access},
isbn = {978-1-4503-8680-7},
language = {en}
}
``` |
Swapnil949/nli_multimodal_ds | ---
dataset_info:
features:
- name: index
dtype: int64
- name: description
dtype: string
- name: scenario
dtype: image
- name: synthetic_scenario
dtype: image
- name: original_scenario
dtype: image
- name: original_scenario_marked
dtype: image
- name: answer
dtype: string
splits:
- name: train
num_bytes: 308486688.8
num_examples: 1400
download_size: 5970748
dataset_size: 308486688.8
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
allenai/sci-sentences-10k | ---
dataset_info:
features:
- name: sentences
sequence: string
- name: tags
sequence: string
splits:
- name: train
num_bytes: 59173993
num_examples: 5341
- name: validation
num_bytes: 19785500
num_examples: 1780
- name: test
num_bytes: 19907809
num_examples: 1781
download_size: 12078848
dataset_size: 98867302
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
liuyanchen1015/MULTI_VALUE_stsb_drop_aux_have | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 14511
num_examples: 68
- name: test
num_bytes: 14180
num_examples: 59
- name: train
num_bytes: 62821
num_examples: 251
download_size: 72112
dataset_size: 91512
---
# Dataset Card for "MULTI_VALUE_stsb_drop_aux_have"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Francesco/tweeter-posts | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': tweeter-posts
'1': caption
'2': tweet
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: tweeter-posts
tags:
- rf100
---
# Dataset Card for tweeter-posts
**The original COCO dataset is stored at `dataset.tar.gz`.**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/tweeter-posts
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
tweeter-posts
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 964043,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column (`dataset[0]["image"]`), the image file is automatically decoded. Decoding a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
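Since `bbox` uses the COCO convention `[x_min, y_min, width, height]`, downstream code often needs corner coordinates instead. A minimal conversion sketch (the function name is illustrative, not part of the dataset's API):

```python
def coco_to_xyxy(bbox):
    """Convert a COCO-format box [x_min, y_min, width, height]
    to corner format [x_min, y_min, x_max, y_max]."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# First box from the data instance shown above.
box = [302.0, 109.0, 73.0, 52.0]
print(coco_to_xyxy(box))  # [302.0, 109.0, 375.0, 161.0]
```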
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/tweeter-posts
### Citation Information
```
@misc{ tweeter-posts,
title = { tweeter posts Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/tweeter-posts } },
url = { https://universe.roboflow.com/object-detection/tweeter-posts },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
distilled-one-sec-cv12-each-chunk-uniq/chunk_74 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1284247076.0
num_examples: 250243
download_size: 1315143122
dataset_size: 1284247076.0
---
# Dataset Card for "chunk_74"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
masakhane/african-translated-alpaca | ---
license: cc-by-nc-4.0
language:
- af
- am
- ar
- en
- ee
- fr
size_categories:
- 100K<n<1M
---
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
Please cite the [stanford_alpaca project](https://github.com/tatsu-lab/stanford_alpaca)
```
@misc{alpaca,
author = {Rohan Taori and Ishaan Gulrajani and Tianyi Zhang and Yann Dubois and Xuechen Li and Carlos Guestrin and Percy Liang and Tatsunori B. Hashimoto },
title = {Stanford Alpaca: An Instruction-following LLaMA model},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/tatsu-lab/stanford_alpaca}},
}
``` |
Tristan/olm-october-2022-with-bookcorpus-tokenized-1024 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 88003461204
num_examples: 14295559
download_size: 23317060365
dataset_size: 88003461204
---
# Dataset Card for "olm-october-2022-with-bookcorpus-tokenized-1024"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1712804468 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 21103
num_examples: 48
download_size: 12220
dataset_size: 21103
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712804468"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
reciprocate/dpo_untoxic | ---
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 1005985
num_examples: 541
download_size: 485334
dataset_size: 1005985
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/physics_dataset_standardized_cluster_1_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 13048987
num_examples: 4356
download_size: 0
dataset_size: 13048987
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "physics_dataset_standardized_cluster_1_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mt0rm0/movie_descriptors | ---
license: cc0-1.0
task_categories:
- sentence-similarity
language:
- en
pretty_name: Movie descriptors for Semantic Search
size_categories:
- 10K<n<100K
tags:
- movies
- embeddings
- semantic search
- films
- hpi
- workshop
---
# Dataset Card
This dataset is a subset of Kaggle's The Movie Dataset that contains only the name, release year, and overview of every film in the original dataset for which that information is complete.
It is intended as a toy dataset for learning about embeddings in a workshop from the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute.
This dataset has a smaller version [here](https://huggingface.co/datasets/mt0rm0/movie_descriptors_small).
## Dataset Details
### Dataset Description
The dataset has 44435 rows and 3 columns:
- 'name': includes the title of the movies
- 'release_year': indicates the year of release
- 'overview': provides a brief description of each movie, used for advertisement.
**Curated by:** [Mario Tormo Romero](https://huggingface.co/mt0rm0)
**Language(s) (NLP):** English
**License:** cc0-1.0
### Dataset Sources
This dataset is a subset of Kaggle's [The Movie Dataset](https://www.kaggle.com/datasets/rounakbanik/the-movies-dataset).
We used only the <kbd>movies_metadata.csv</kbd> file, extracted some features (see Dataset Description), and dropped the rows that weren't complete.
The original dataset has a cc0-1.0 license, which we have kept.
## Uses
This is a toy dataset created for pedagogical purposes, and is used in the **Working with embeddings** workshop created and organized by the [AI Service Center Berlin-Brandenburg](https://hpi.de/kisz/) at the [Hasso Plattner Institute](https://hpi.de/).
## Dataset Creation
### Curation Rationale
With this dataset we want to provide a fast way of obtaining the data required for our workshops, without having to download huge datasets that contain far more information than needed.
### Source Data
Our source is Kaggle's The Movie Dataset, so the information comes from the MovieLens Dataset. The dataset consists of movies released on or before July 2017.
#### Data Collection and Processing
The data was downloaded from [Kaggle](https://www.kaggle.com/datasets/rounakbanik/the-movies-dataset) as a zip file. The file <kbd>movies_metadata.csv</kbd> was then extracted.
The data was processed with the following code:
```python
import pandas as pd
# load the csv file
df = pd.read_csv("movies_metadata.csv", low_memory=False)
# select the required columns, drop rows with missing values and
# reset the index
df = df.loc[:, ['title', 'release_date', 'overview']]
df = df.dropna(axis=0).reset_index(drop=True)
# make a new column with the release year
df.loc[:, 'release_year'] = pd.to_datetime(df.release_date).dt.year
# select the columns in the desired order
df = df.loc[:, ['title', 'release_year', 'overview']]
# save the data to parquet
df.to_parquet('descriptors_data.parquet')
```
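The same steps can be sanity-checked on a tiny inline frame before running them on the full <kbd>movies_metadata.csv</kbd> (the toy rows below are made up for illustration):

```python
import pandas as pd

# two complete rows and one with a missing overview
toy = pd.DataFrame({
    "title": ["A", "B", "C"],
    "release_date": ["1999-03-31", "2010-07-16", "1994-09-23"],
    "overview": ["first", "second", None],
})

# same pipeline as above: select columns, drop incomplete rows,
# derive the release year, reorder
toy = toy.loc[:, ["title", "release_date", "overview"]]
toy = toy.dropna(axis=0).reset_index(drop=True)
toy.loc[:, "release_year"] = pd.to_datetime(toy.release_date).dt.year
toy = toy.loc[:, ["title", "release_year", "overview"]]

print(toy)  # two rows remain: (A, 1999, first) and (B, 2010, second)
```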
#### Who are the source data producers?
This dataset is an ensemble of data collected by [Rounak Banik](https://www.kaggle.com/rounakbanik) from TMDB and GroupLens.
In particular, the movies metadata has been collected from the TMDB Open API, but the source dataset is not endorsed or certified by TMDb. |
FanjiaYan/CS182-DreamBooth-dataset | ---
license: apache-2.0
---
|
datadrivenscience/movie-genre-prediction | ---
dataset_info:
features:
- name: id
dtype: int64
- name: movie_name
dtype: string
- name: synopsis
dtype: string
- name: genre
dtype: string
splits:
- name: train
num_bytes: 10488729
num_examples: 54000
- name: test
num_bytes: 6965864
num_examples: 36000
download_size: 11902232
dataset_size: 17454593
---
# Dataset Card for Movie Genre Prediction
Link to [Movie Genre Prediction Competition](https://huggingface.co/spaces/competitions/movie-genre-prediction)
By accessing this dataset, you accept the rules of the Movie Genre Prediction competition.
# Organizer
Organizer of this competition is [Data-Driven Science](https://datadrivenscience.com/).
[Join our FREE 3-Day Object Detection Challenge!](https://datadrivenscience.com/free-object-detection-challenge/)
<img src="https://datadrivenscience.com/wp-content/uploads/2022/12/DDS-Logo.png" width="200" height="100">
# Email Usage
By accessing this dataset, you consent that your email will be used for communication purposes from Data-Driven Science.
We do not share nor sell our mailing list. Your information remains confidential. You may unsubscribe at any time.
|
open-llm-leaderboard/details_NLUHOPOE__Mistral-7B-loss-100000 | ---
pretty_name: Evaluation run of NLUHOPOE/Mistral-7B-loss-100000
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NLUHOPOE/Mistral-7B-loss-100000](https://huggingface.co/NLUHOPOE/Mistral-7B-loss-100000)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NLUHOPOE__Mistral-7B-loss-100000\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-01-26T10:12:29.042684](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__Mistral-7B-loss-100000/blob/main/results_2024-01-26T10-12-29.042684.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5369263603517691,\n\
\ \"acc_stderr\": 0.033848622931727725,\n \"acc_norm\": 0.5429723385562029,\n\
\ \"acc_norm_stderr\": 0.03460142992361602,\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.4092562147054596,\n\
\ \"mc2_stderr\": 0.014606886822140043\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47525597269624575,\n \"acc_stderr\": 0.01459348769493774,\n\
\ \"acc_norm\": 0.5179180887372014,\n \"acc_norm_stderr\": 0.014602005585490973\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5760804620593507,\n\
\ \"acc_stderr\": 0.004931679059919374,\n \"acc_norm\": 0.7715594503087034,\n\
\ \"acc_norm_stderr\": 0.004189698894885502\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.030151134457776296,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.030151134457776296\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.041321250197233685,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.041321250197233685\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179327,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179327\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596437,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596437\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6225806451612903,\n \"acc_stderr\": 0.027575960723278246,\n \"\
acc_norm\": 0.6225806451612903,\n \"acc_norm_stderr\": 0.027575960723278246\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.35960591133004927,\n \"acc_stderr\": 0.033764582465095665,\n \"\
acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.033764582465095665\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.036810508691615486,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.036810508691615486\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.033322999210706444,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.033322999210706444\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817223,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817223\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.02534267129380725,\n \
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.02534267129380725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4831932773109244,\n \"acc_stderr\": 0.03246013680375308,\n \
\ \"acc_norm\": 0.4831932773109244,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7211009174311926,\n\
\ \"acc_stderr\": 0.0192274688764635,\n \"acc_norm\": 0.7211009174311926,\n\
\ \"acc_norm_stderr\": 0.0192274688764635\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252335,\n\
\ \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7303921568627451,\n \"acc_stderr\": 0.03114557065948678,\n \"\
acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.03114557065948678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161551,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161551\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\
\ \"acc_stderr\": 0.02490443909891824,\n \"acc_norm\": 0.8247863247863247,\n\
\ \"acc_norm_stderr\": 0.02490443909891824\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7126436781609196,\n\
\ \"acc_stderr\": 0.0161824107306827,\n \"acc_norm\": 0.7126436781609196,\n\
\ \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5982658959537572,\n \"acc_stderr\": 0.026394104177643634,\n\
\ \"acc_norm\": 0.5982658959537572,\n \"acc_norm_stderr\": 0.026394104177643634\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31508379888268156,\n\
\ \"acc_stderr\": 0.015536850852473636,\n \"acc_norm\": 0.31508379888268156,\n\
\ \"acc_norm_stderr\": 0.015536850852473636\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.02855582751652878,\n\
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.02855582751652878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.027264297599804015,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.027264297599804015\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4074315514993481,\n\
\ \"acc_stderr\": 0.012549473714212223,\n \"acc_norm\": 0.4074315514993481,\n\
\ \"acc_norm_stderr\": 0.012549473714212223\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5751633986928104,\n \"acc_stderr\": 0.01999797303545833,\n \
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.01999797303545833\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670238,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670238\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.031512360446742695,\n\
\ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.031512360446742695\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.030360490154014638,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.030360490154014638\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.4092562147054596,\n\
\ \"mc2_stderr\": 0.014606886822140043\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.01183587216483667\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18574677786201668,\n \
\ \"acc_stderr\": 0.01071229890272908\n }\n}\n```"
repo_url: https://huggingface.co/NLUHOPOE/Mistral-7B-loss-100000
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|arc:challenge|25_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|arc:challenge|25_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|gsm8k|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|gsm8k|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hellaswag|10_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hellaswag|10_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T10-00-16.807578.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T10-12-29.042684.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T10-12-29.042684.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- '**/details_harness|winogrande|5_2024-01-26T10-00-16.807578.parquet'
- split: 2024_01_26T10_12_29.042684
path:
- '**/details_harness|winogrande|5_2024-01-26T10-12-29.042684.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-26T10-12-29.042684.parquet'
- config_name: results
data_files:
- split: 2024_01_26T10_00_16.807578
path:
- results_2024-01-26T10-00-16.807578.parquet
- split: 2024_01_26T10_12_29.042684
path:
- results_2024-01-26T10-12-29.042684.parquet
- split: latest
path:
- results_2024-01-26T10-12-29.042684.parquet
---
# Dataset Card for Evaluation run of NLUHOPOE/Mistral-7B-loss-100000
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NLUHOPOE/Mistral-7B-loss-100000](https://huggingface.co/NLUHOPOE/Mistral-7B-loss-100000) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NLUHOPOE__Mistral-7B-loss-100000",
"harness_winogrande_5",
split="train")
```
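Because each run adds a timestamped split alongside `latest`, you may want to resolve the newest timestamped split name yourself. A minimal sketch, using the split names that appear in this card (the helper function is illustrative, not part of the leaderboard tooling):

```python
def latest_timestamped_split(split_names):
    """Return the most recent timestamped split from a list of split names."""
    # Drop the alias split; the remaining names are timestamps in the
    # zero-padded YYYY_MM_DDTHH_MM_SS.ffffff format used by this card,
    # so lexicographic order matches chronological order.
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)

splits = ["2024_01_26T10_00_16.807578", "2024_01_26T10_12_29.042684", "latest"]
print(latest_timestamped_split(splits))  # -> 2024_01_26T10_12_29.042684
```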
## Latest results
These are the [latest results from run 2024-01-26T10:12:29.042684](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__Mistral-7B-loss-100000/blob/main/results_2024-01-26T10-12-29.042684.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5369263603517691,
"acc_stderr": 0.033848622931727725,
"acc_norm": 0.5429723385562029,
"acc_norm_stderr": 0.03460142992361602,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.4092562147054596,
"mc2_stderr": 0.014606886822140043
},
"harness|arc:challenge|25": {
"acc": 0.47525597269624575,
"acc_stderr": 0.01459348769493774,
"acc_norm": 0.5179180887372014,
"acc_norm_stderr": 0.014602005585490973
},
"harness|hellaswag|10": {
"acc": 0.5760804620593507,
"acc_stderr": 0.004931679059919374,
"acc_norm": 0.7715594503087034,
"acc_norm_stderr": 0.004189698894885502
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.030151134457776296,
"acc_norm": 0.6,
"acc_norm_stderr": 0.030151134457776296
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.041321250197233685,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.041321250197233685
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179327,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179327
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596437,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596437
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.027575960723278246,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.027575960723278246
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.033764582465095665,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.033764582465095665
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.036810508691615486,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.036810508691615486
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.033322999210706444,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.033322999210706444
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817223,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817223
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.02534267129380725,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.02534267129380725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4831932773109244,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.4831932773109244,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7211009174311926,
"acc_stderr": 0.0192274688764635,
"acc_norm": 0.7211009174311926,
"acc_norm_stderr": 0.0192274688764635
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.03114557065948678,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.03114557065948678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161551,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161551
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8247863247863247,
"acc_stderr": 0.02490443909891824,
"acc_norm": 0.8247863247863247,
"acc_norm_stderr": 0.02490443909891824
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7126436781609196,
"acc_stderr": 0.0161824107306827,
"acc_norm": 0.7126436781609196,
"acc_norm_stderr": 0.0161824107306827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5982658959537572,
"acc_stderr": 0.026394104177643634,
"acc_norm": 0.5982658959537572,
"acc_norm_stderr": 0.026394104177643634
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31508379888268156,
"acc_stderr": 0.015536850852473636,
"acc_norm": 0.31508379888268156,
"acc_norm_stderr": 0.015536850852473636
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.02855582751652878,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.02855582751652878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.027264297599804015,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.027264297599804015
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4074315514993481,
"acc_stderr": 0.012549473714212223,
"acc_norm": 0.4074315514993481,
"acc_norm_stderr": 0.012549473714212223
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.01999797303545833,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.01999797303545833
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670238,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670238
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.031512360446742695,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.031512360446742695
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014638,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014638
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.4092562147054596,
"mc2_stderr": 0.014606886822140043
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.01183587216483667
},
"harness|gsm8k|5": {
"acc": 0.18574677786201668,
"acc_stderr": 0.01071229890272908
}
}
```
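The per-task entries in the JSON above share the same key layout, so aggregate figures can be recomputed from them. A small illustrative sketch using three of the values shown in this card (a plain unweighted mean, not the official leaderboard aggregation):

```python
# A subset of the per-task results from the JSON above, keyed the same way.
results = {
    "harness|arc:challenge|25": {"acc": 0.47525597269624575},
    "harness|hellaswag|10": {"acc": 0.5760804620593507},
    "harness|winogrande|5": {"acc": 0.7695343330702447},
}

# Unweighted mean accuracy over the selected tasks (roughly 0.607 here).
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))
```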
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
vineetkukreti25/gdfgfdgdfg | ---
license: mit
---
|
lokesh2002/construction_sample_dataset1 | ---
license: apache-2.0
dataset_info:
features:
- name: image
dtype: image
- name: ' text'
dtype: string
splits:
- name: train
num_bytes: 4214025.0
num_examples: 10
download_size: 4162297
dataset_size: 4214025.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
deepak-newzera/spectrogram_data_max_music_dataset-1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: audio_file
dtype: string
- name: slice
dtype: int16
splits:
- name: train
num_bytes: 705907835.375
num_examples: 15949
download_size: 703640547
dataset_size: 705907835.375
---
# Dataset Card for "spectrogram_data_max_music_dataset-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rosenberg/webnlg | ---
license: mit
---
|
aasd291809733/myself | ---
license: apache-2.0
---
|
arieg/bw_spec_cls_80_03 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '6342'
'1': '6354'
'2': '6357'
'3': '6360'
'4': '6366'
'5': '6367'
'6': '6370'
'7': '6373'
'8': '6379'
'9': '6381'
'10': '6382'
'11': '6383'
'12': '6387'
'13': '6389'
'14': '6393'
'15': '6394'
'16': '6396'
'17': '6406'
'18': '6407'
'19': '6439'
'20': '6440'
'21': '6442'
'22': '6443'
'23': '6448'
'24': '6459'
'25': '6469'
'26': '6517'
'27': '6519'
'28': '6603'
'29': '6605'
'30': '6606'
'31': '6607'
'32': '6608'
'33': '6609'
'34': '6610'
'35': '6611'
'36': '6674'
'37': '6675'
'38': '6677'
'39': '6679'
'40': '6680'
'41': '6684'
'42': '6776'
'43': '6778'
'44': '6788'
'45': '6802'
'46': '6803'
'47': '6854'
'48': '6855'
'49': '6856'
'50': '6857'
'51': '7481'
'52': '7482'
'53': '7483'
'54': '7487'
'55': '7488'
'56': '7489'
'57': '7490'
'58': '7491'
'59': '7492'
'60': '7495'
'61': '7526'
'62': '7527'
'63': '7528'
'64': '7529'
'65': '7548'
'66': '7554'
'67': '7709'
'68': '7710'
'69': '7711'
'70': '7712'
'71': '7713'
'72': '7872'
'73': '8056'
'74': '8208'
'75': '8256'
'76': '8259'
'77': '8261'
'78': '8345'
'79': '8357'
splits:
- name: train
num_bytes: 89110585.6
num_examples: 1600
download_size: 88647318
dataset_size: 89110585.6
---
# Dataset Card for "bw_spec_cls_80_03"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mickylan2367/LoadingScriptPractice | ---
license: cc-by-sa-4.0
language:
- en
tags:
- music
---
* A practice repository for trying out a loading script that uses the Hugging Face API.
* The data content is almost the same as <a href="https://huggingface.co/datasets/mickylan2367/GraySpectrogram2">mickylan2367/GraySpectrogram</a>.
|
Madiator2011/lyoko-dataset | ---
license: mit
task_categories:
- question-answering
language:
- en
pretty_name: Lyoko Wiki
--- |
EduardoPacheco/gpt4v-LAION-discord | ---
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
- name: link
dtype: string
- name: message_id
dtype: string
- name: timestamp
dtype: string
splits:
- name: train
num_bytes: 36014887.0
num_examples: 136
download_size: 0
dataset_size: 36014887.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gpt4v-LAION-discord"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/find_first_sent_train_100_eval_10 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 267331
num_examples: 210
- name: validation
num_bytes: 10399
num_examples: 10
download_size: 135617
dataset_size: 277730
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_first_sent_train_100_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GbotHQ/private-dataset-test | ---
license: openrail
---
|
open-llm-leaderboard/details_XuanXuanXuanXuan__Llama-2-13b-hf-gpt-4-80k | ---
pretty_name: Evaluation run of XuanXuanXuanXuan/Llama-2-13b-hf-gpt-4-80k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [XuanXuanXuanXuan/Llama-2-13b-hf-gpt-4-80k](https://huggingface.co/XuanXuanXuanXuan/Llama-2-13b-hf-gpt-4-80k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_XuanXuanXuanXuan__Llama-2-13b-hf-gpt-4-80k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T14:11:38.492802](https://huggingface.co/datasets/open-llm-leaderboard/details_XuanXuanXuanXuan__Llama-2-13b-hf-gpt-4-80k/blob/main/results_2024-03-21T14-11-38.492802.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5541047461625745,\n\
\ \"acc_stderr\": 0.03365635356205185,\n \"acc_norm\": 0.5606137329142736,\n\
\ \"acc_norm_stderr\": 0.034375449118507075,\n \"mc1\": 0.3561811505507956,\n\
\ \"mc1_stderr\": 0.016763790728446335,\n \"mc2\": 0.4982954224425231,\n\
\ \"mc2_stderr\": 0.015387959222659859\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5853242320819113,\n \"acc_stderr\": 0.014397070564409174,\n\
\ \"acc_norm\": 0.6083617747440273,\n \"acc_norm_stderr\": 0.01426412212493822\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5999800836486756,\n\
\ \"acc_stderr\": 0.004889007921214698,\n \"acc_norm\": 0.7988448516231826,\n\
\ \"acc_norm_stderr\": 0.004000445083522541\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.0404633688397825,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.0404633688397825\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.032025630761017346,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.032025630761017346\n \
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871137,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871137\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\
\ \"acc_stderr\": 0.02659308451657227,\n \"acc_norm\": 0.6774193548387096,\n\
\ \"acc_norm_stderr\": 0.02659308451657227\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03681050869161551,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03681050869161551\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.03274287914026868,\n \"acc_norm\"\
: 0.696969696969697,\n \"acc_norm_stderr\": 0.03274287914026868\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624527,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624527\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5051282051282051,\n \"acc_stderr\": 0.025349672906838653,\n\
\ \"acc_norm\": 0.5051282051282051,\n \"acc_norm_stderr\": 0.025349672906838653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.0322841062671639,\n \
\ \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.0322841062671639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7357798165137615,\n \"acc_stderr\": 0.01890416417151019,\n \"\
acc_norm\": 0.7357798165137615,\n \"acc_norm_stderr\": 0.01890416417151019\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460305,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460305\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.02645350805404032,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.02645350805404032\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7445721583652618,\n\
\ \"acc_stderr\": 0.01559495538445576,\n \"acc_norm\": 0.7445721583652618,\n\
\ \"acc_norm_stderr\": 0.01559495538445576\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584197,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584197\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4446927374301676,\n\
\ \"acc_stderr\": 0.01661988198817702,\n \"acc_norm\": 0.4446927374301676,\n\
\ \"acc_norm_stderr\": 0.01661988198817702\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302898,\n\
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776165,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776165\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.02682280175950789,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.02682280175950789\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612503,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612503\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41590612777053454,\n\
\ \"acc_stderr\": 0.012588323850313625,\n \"acc_norm\": 0.41590612777053454,\n\
\ \"acc_norm_stderr\": 0.012588323850313625\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.030365446477275668,\n\
\ \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.030365446477275668\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5392156862745098,\n \"acc_stderr\": 0.0201655233139079,\n \
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.0201655233139079\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235933,\n\
\ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235933\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3561811505507956,\n\
\ \"mc1_stderr\": 0.016763790728446335,\n \"mc2\": 0.4982954224425231,\n\
\ \"mc2_stderr\": 0.015387959222659859\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.728492501973165,\n \"acc_stderr\": 0.012499326254893127\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2175890826383624,\n \
\ \"acc_stderr\": 0.011365231761189584\n }\n}\n```"
repo_url: https://huggingface.co/XuanXuanXuanXuan/Llama-2-13b-hf-gpt-4-80k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|arc:challenge|25_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|gsm8k|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hellaswag|10_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-11-38.492802.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T14-11-38.492802.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- '**/details_harness|winogrande|5_2024-03-21T14-11-38.492802.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T14-11-38.492802.parquet'
- config_name: results
data_files:
- split: 2024_03_21T14_11_38.492802
path:
- results_2024-03-21T14-11-38.492802.parquet
- split: latest
path:
- results_2024-03-21T14-11-38.492802.parquet
---
# Dataset Card for Evaluation run of XuanXuanXuanXuan/Llama-2-13b-hf-gpt-4-80k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [XuanXuanXuanXuan/Llama-2-13b-hf-gpt-4-80k](https://huggingface.co/XuanXuanXuanXuan/Llama-2-13b-hf-gpt-4-80k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_XuanXuanXuanXuan__Llama-2-13b-hf-gpt-4-80k",
"harness_winogrande_5",
split="train")
```
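As noted above, each run's split is named using the run timestamp, with the dashes and colons of the timestamp replaced by underscores. A minimal sketch of that mapping (the helper name is hypothetical, not part of any tooling shipped with this dataset):

```python
# Hypothetical helper illustrating how a run timestamp maps to the split
# name used in this card's configs: dashes and colons become underscores,
# while the fractional-seconds dot is kept.
def timestamp_to_split(ts: str) -> str:
    date, time = ts.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(timestamp_to_split("2024-03-21T14:11:38.492802"))
# → 2024_03_21T14_11_38.492802
```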
## Latest results
These are the [latest results from run 2024-03-21T14:11:38.492802](https://huggingface.co/datasets/open-llm-leaderboard/details_XuanXuanXuanXuan__Llama-2-13b-hf-gpt-4-80k/blob/main/results_2024-03-21T14-11-38.492802.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5541047461625745,
"acc_stderr": 0.03365635356205185,
"acc_norm": 0.5606137329142736,
"acc_norm_stderr": 0.034375449118507075,
"mc1": 0.3561811505507956,
"mc1_stderr": 0.016763790728446335,
"mc2": 0.4982954224425231,
"mc2_stderr": 0.015387959222659859
},
"harness|arc:challenge|25": {
"acc": 0.5853242320819113,
"acc_stderr": 0.014397070564409174,
"acc_norm": 0.6083617747440273,
"acc_norm_stderr": 0.01426412212493822
},
"harness|hellaswag|10": {
"acc": 0.5999800836486756,
"acc_stderr": 0.004889007921214698,
"acc_norm": 0.7988448516231826,
"acc_norm_stderr": 0.004000445083522541
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.0404633688397825,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.0404633688397825
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.032025630761017346,
"acc_norm": 0.4,
"acc_norm_stderr": 0.032025630761017346
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871137,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871137
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.02659308451657227,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.02659308451657227
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03681050869161551,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03681050869161551
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03274287914026868,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03274287914026868
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624527,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624527
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5051282051282051,
"acc_stderr": 0.025349672906838653,
"acc_norm": 0.5051282051282051,
"acc_norm_stderr": 0.025349672906838653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5546218487394958,
"acc_stderr": 0.0322841062671639,
"acc_norm": 0.5546218487394958,
"acc_norm_stderr": 0.0322841062671639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7357798165137615,
"acc_stderr": 0.01890416417151019,
"acc_norm": 0.7357798165137615,
"acc_norm_stderr": 0.01890416417151019
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460305,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460305
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.02645350805404032,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.02645350805404032
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7445721583652618,
"acc_stderr": 0.01559495538445576,
"acc_norm": 0.7445721583652618,
"acc_norm_stderr": 0.01559495538445576
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584197,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584197
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4446927374301676,
"acc_stderr": 0.01661988198817702,
"acc_norm": 0.4446927374301676,
"acc_norm_stderr": 0.01661988198817702
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.027684181883302898,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.027684181883302898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776165,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.02682280175950789,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.02682280175950789
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41590612777053454,
"acc_stderr": 0.012588323850313625,
"acc_norm": 0.41590612777053454,
"acc_norm_stderr": 0.012588323850313625
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.030365446477275668,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.030365446477275668
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.0201655233139079,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.0201655233139079
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.031130880396235933,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.031130880396235933
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3561811505507956,
"mc1_stderr": 0.016763790728446335,
"mc2": 0.4982954224425231,
"mc2_stderr": 0.015387959222659859
},
"harness|winogrande|5": {
"acc": 0.728492501973165,
"acc_stderr": 0.012499326254893127
},
"harness|gsm8k|5": {
"acc": 0.2175890826383624,
"acc_stderr": 0.011365231761189584
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mjphayes/elpv-augmented | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: class
dtype: int64
- name: type
dtype: string
splits:
- name: train
num_bytes: 138172974.72
num_examples: 4416
- name: validation
num_bytes: 13534024.0
num_examples: 394
- name: test
num_bytes: 22354586.0
num_examples: 654
download_size: 191567217
dataset_size: 174061584.72
---
# Dataset Card for "elpv-augmented"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b | ---
pretty_name: Evaluation run of totally-not-an-llm/PuddleJumper-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [totally-not-an-llm/PuddleJumper-13b](https://huggingface.co/totally-not-an-llm/PuddleJumper-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T13:22:43.977787](https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b/blob/main/results_2023-09-17T13-22-43.977787.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08452181208053691,\n\
\ \"em_stderr\": 0.002848708763936303,\n \"f1\": 0.20933095637583904,\n\
\ \"f1_stderr\": 0.003279820666133777,\n \"acc\": 0.38053092049715975,\n\
\ \"acc_stderr\": 0.008728490320313857\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.08452181208053691,\n \"em_stderr\": 0.002848708763936303,\n\
\ \"f1\": 0.20933095637583904,\n \"f1_stderr\": 0.003279820666133777\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03335860500379075,\n \
\ \"acc_stderr\": 0.004946282649173775\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7277032359905288,\n \"acc_stderr\": 0.012510697991453937\n\
\ }\n}\n```"
repo_url: https://huggingface.co/totally-not-an-llm/PuddleJumper-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|arc:challenge|25_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T13_22_43.977787
path:
- '**/details_harness|drop|3_2023-09-17T13-22-43.977787.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T13-22-43.977787.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T13_22_43.977787
path:
- '**/details_harness|gsm8k|5_2023-09-17T13-22-43.977787.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T13-22-43.977787.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hellaswag|10_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T00:36:46.680857.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T00_36_46.680857
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T00:36:46.680857.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T00:36:46.680857.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T13_22_43.977787
path:
- '**/details_harness|winogrande|5_2023-09-17T13-22-43.977787.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T13-22-43.977787.parquet'
- config_name: results
data_files:
- split: 2023_09_17T13_22_43.977787
path:
- results_2023-09-17T13-22-43.977787.parquet
- split: latest
path:
- results_2023-09-17T13-22-43.977787.parquet
---
# Dataset Card for Evaluation run of totally-not-an-llm/PuddleJumper-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/totally-not-an-llm/PuddleJumper-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [totally-not-an-llm/PuddleJumper-13b](https://huggingface.co/totally-not-an-llm/PuddleJumper-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T13:22:43.977787](https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b/blob/main/results_2023-09-17T13-22-43.977787.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08452181208053691,
"em_stderr": 0.002848708763936303,
"f1": 0.20933095637583904,
"f1_stderr": 0.003279820666133777,
"acc": 0.38053092049715975,
"acc_stderr": 0.008728490320313857
},
"harness|drop|3": {
"em": 0.08452181208053691,
"em_stderr": 0.002848708763936303,
"f1": 0.20933095637583904,
"f1_stderr": 0.003279820666133777
},
"harness|gsm8k|5": {
"acc": 0.03335860500379075,
"acc_stderr": 0.004946282649173775
},
"harness|winogrande|5": {
"acc": 0.7277032359905288,
"acc_stderr": 0.012510697991453937
}
}
```
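As a quick sanity check, the overall `acc` above is the unweighted macro-average of the two accuracy-bearing tasks. The snippet below (an illustration with values copied from the results above, not code from the evaluation harness) reproduces it:

```python
# Per-task accuracies copied from the latest results above.
task_acc = {
    "harness|gsm8k|5": 0.03335860500379075,
    "harness|winogrande|5": 0.7277032359905288,
}

# Unweighted macro-average over tasks; this matches the "all" acc
# reported above (0.38053092049715975).
macro_acc = sum(task_acc.values()) / len(task_acc)
print(round(macro_acc, 6))  # → 0.380531
```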
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Thouph/Laion_aesthetics_5plus_1024_33M_csv | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_decapoda-research__Antares-11b-v1 | ---
pretty_name: Evaluation run of decapoda-research/Antares-11b-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [decapoda-research/Antares-11b-v1](https://huggingface.co/decapoda-research/Antares-11b-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_decapoda-research__Antares-11b-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T13:13:13.296577](https://huggingface.co/datasets/open-llm-leaderboard/details_decapoda-research__Antares-11b-v1/blob/main/results_2024-01-04T13-13-13.296577.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6602238370348961,\n\
\ \"acc_stderr\": 0.03146617343256451,\n \"acc_norm\": 0.6625896368336766,\n\
\ \"acc_norm_stderr\": 0.03209888448148018,\n \"mc1\": 0.36964504283965727,\n\
\ \"mc1_stderr\": 0.016898180706973884,\n \"mc2\": 0.5283649819747338,\n\
\ \"mc2_stderr\": 0.015000610527158549\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6083617747440273,\n \"acc_stderr\": 0.014264122124938215,\n\
\ \"acc_norm\": 0.6450511945392492,\n \"acc_norm_stderr\": 0.013983036904094089\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6535550687114121,\n\
\ \"acc_stderr\": 0.0047486451332815725,\n \"acc_norm\": 0.8485361481776539,\n\
\ \"acc_norm_stderr\": 0.0035776774950640926\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.031639106653672915,\n\
\ \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.031639106653672915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.46825396825396826,\n \"acc_stderr\": 0.0256993528321318,\n \"\
acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.0256993528321318\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.021886178567172537,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.021886178567172537\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342856,\n\
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342856\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8568807339449541,\n \"acc_stderr\": 0.015014462497168583,\n \"\
acc_norm\": 0.8568807339449541,\n \"acc_norm_stderr\": 0.015014462497168583\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568624,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568624\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878463,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878463\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.03021683101150877,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.03021683101150877\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406943,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406943\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903335,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903335\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2335195530726257,\n\
\ \"acc_stderr\": 0.014149575348976266,\n \"acc_norm\": 0.2335195530726257,\n\
\ \"acc_norm_stderr\": 0.014149575348976266\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.02575586592263294,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.02575586592263294\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.023683591837008553,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.023683591837008553\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5212765957446809,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.5212765957446809,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.500651890482399,\n\
\ \"acc_stderr\": 0.01277022525225556,\n \"acc_norm\": 0.500651890482399,\n\
\ \"acc_norm_stderr\": 0.01277022525225556\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7279411764705882,\n \"acc_stderr\": 0.027033041151681456,\n\
\ \"acc_norm\": 0.7279411764705882,\n \"acc_norm_stderr\": 0.027033041151681456\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466115,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466115\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36964504283965727,\n\
\ \"mc1_stderr\": 0.016898180706973884,\n \"mc2\": 0.5283649819747338,\n\
\ \"mc2_stderr\": 0.015000610527158549\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825888\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5830174374526156,\n \
\ \"acc_stderr\": 0.013581320997216586\n }\n}\n```"
repo_url: https://huggingface.co/decapoda-research/Antares-11b-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|arc:challenge|25_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|gsm8k|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hellaswag|10_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-13-13.296577.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T13-13-13.296577.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- '**/details_harness|winogrande|5_2024-01-04T13-13-13.296577.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T13-13-13.296577.parquet'
- config_name: results
data_files:
- split: 2024_01_04T13_13_13.296577
path:
- results_2024-01-04T13-13-13.296577.parquet
- split: latest
path:
- results_2024-01-04T13-13-13.296577.parquet
---
# Dataset Card for Evaluation run of decapoda-research/Antares-11b-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [decapoda-research/Antares-11b-v1](https://huggingface.co/decapoda-research/Antares-11b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_decapoda-research__Antares-11b-v1",
"harness_winogrande_5",
split="train")
```
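Per-run splits are named after the run timestamp, with the `-` and `:` characters replaced by `_`. The helper below is not part of the dataset tooling, just a sketch of that naming scheme:

```python
# Sketch of the split-naming scheme used in this dataset's configurations:
# the run timestamp "2024-01-04T13:13:13.296577" maps to the split name
# "2024_01_04T13_13_13.296577". The "latest" split always points to the
# newest run.
def split_name_from_timestamp(ts: str) -> str:
    # Replace characters that do not appear in split names.
    return ts.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2024-01-04T13:13:13.296577"))
# -> 2024_01_04T13_13_13.296577
```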
## Latest results
These are the [latest results from run 2024-01-04T13:13:13.296577](https://huggingface.co/datasets/open-llm-leaderboard/details_decapoda-research__Antares-11b-v1/blob/main/results_2024-01-04T13-13-13.296577.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in its timestamped split and in the "latest" split of each configuration):
```python
{
"all": {
"acc": 0.6602238370348961,
"acc_stderr": 0.03146617343256451,
"acc_norm": 0.6625896368336766,
"acc_norm_stderr": 0.03209888448148018,
"mc1": 0.36964504283965727,
"mc1_stderr": 0.016898180706973884,
"mc2": 0.5283649819747338,
"mc2_stderr": 0.015000610527158549
},
"harness|arc:challenge|25": {
"acc": 0.6083617747440273,
"acc_stderr": 0.014264122124938215,
"acc_norm": 0.6450511945392492,
"acc_norm_stderr": 0.013983036904094089
},
"harness|hellaswag|10": {
"acc": 0.6535550687114121,
"acc_stderr": 0.0047486451332815725,
"acc_norm": 0.8485361481776539,
"acc_norm_stderr": 0.0035776774950640926
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.031639106653672915,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.031639106653672915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.0256993528321318,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.0256993528321318
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172537,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172537
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253252,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253252
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342856,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342856
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8568807339449541,
"acc_stderr": 0.015014462497168583,
"acc_norm": 0.8568807339449541,
"acc_norm_stderr": 0.015014462497168583
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568624,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878463,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878463
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.03021683101150877,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.03021683101150877
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406943,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406943
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903335,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903335
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2335195530726257,
"acc_stderr": 0.014149575348976266,
"acc_norm": 0.2335195530726257,
"acc_norm_stderr": 0.014149575348976266
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263294,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263294
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.023683591837008553,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.023683591837008553
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5212765957446809,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.5212765957446809,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.500651890482399,
"acc_stderr": 0.01277022525225556,
"acc_norm": 0.500651890482399,
"acc_norm_stderr": 0.01277022525225556
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7279411764705882,
"acc_stderr": 0.027033041151681456,
"acc_norm": 0.7279411764705882,
"acc_norm_stderr": 0.027033041151681456
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466115,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466115
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36964504283965727,
"mc1_stderr": 0.016898180706973884,
"mc2": 0.5283649819747338,
"mc2_stderr": 0.015000610527158549
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.010569021122825888
},
"harness|gsm8k|5": {
"acc": 0.5830174374526156,
"acc_stderr": 0.013581320997216586
}
}
```
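The top-level `"all"` block averages the per-task accuracies. A minimal sketch of how such an aggregate could be recomputed, using only two of the per-task entries above (the real `"all"` value averages over every evaluated task):

```python
# Recompute a mean accuracy from per-task results. Only two tasks from the
# JSON above are included here for illustration; the leaderboard aggregates
# over all tasks.
results = {
    "harness|arc:challenge|25": {"acc": 0.6083617747440273},
    "harness|hellaswag|10": {"acc": 0.6535550687114121},
}

mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))
```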
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Fael2d/Voz63 | ---
license: openrail
---
|
mindchain/synth1 | ---
license: openrail
size_categories:
- 1K<n<10K
task_categories:
- feature-extraction
- text-classification
language:
- de
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
A synthetic dataset for instruction fine-tuning.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[German]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[`Frage` (question), `Antwort` (answer)]
### Data Splits
[Train]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_rte_present_perfect_ever | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 271062
num_examples: 623
- name: train
num_bytes: 234505
num_examples: 497
download_size: 330203
dataset_size: 505567
---
# Dataset Card for "MULTI_VALUE_rte_present_perfect_ever"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/wiki_find_passage_train10_eval10_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 22687
num_examples: 30
- name: validation
num_bytes: 7013
num_examples: 10
download_size: 25278
dataset_size: 29700
---
# Dataset Card for "wiki_find_passage_train10_eval10_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-25118781-8365116 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- scientific_papers
eval_info:
task: summarization
model: google/bigbird-pegasus-large-pubmed
metrics: ['bertscore', 'meteor']
dataset_name: scientific_papers
dataset_config: pubmed
dataset_split: test
col_mapping:
text: article
target: abstract
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/bigbird-pegasus-large-pubmed
* Dataset: scientific_papers
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Blaise_g](https://huggingface.co/Blaise_g) for evaluating this model. |
longAtSJSU/TrainData | ---
license: other
task_categories:
- text-classification
language:
- en
pretty_name: resever
---
---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- cc-by-nc-nd-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- summarization
task_ids: []
paperswithcode_id: samsum-corpus
pretty_name: SAMSum Corpus
tags:
- conversations-summarization
dataset_info:
features:
- name: id
dtype: string
- name: dialogue
dtype: string
- name: summary
dtype: string
config_name: samsum
splits:
- name: train
num_bytes: 9479141
num_examples: 14732
- name: test
num_bytes: 534492
num_examples: 819
- name: validation
num_bytes: 516431
num_examples: 818
download_size: 2944100
dataset_size: 10530064
train-eval-index:
- config: samsum
task: summarization
task_id: summarization
splits:
eval_split: test
col_mapping:
dialogue: text
summary: target
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jacobbieker/global-mosaic-of-geostationary-images | ---
license: mit
tags:
- climate
pretty_name: Global Mosaic of Geostationary Images
size_categories:
- 1K<n<10K
---
This dataset comprises a converted version of the Global Mosaic of Geostationary Images from NOAA on AWS, which stitches together geostationary imagery every hour into a snapshot of
nearly the entire world in 5 bands at an 8 km resolution. The original NetCDF files were converted to Zarr and concatenated along the time dimension per day, with each of the 5 bands
being included in one overall Xarray Dataset. |
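As a rough sketch of the per-day layout described above — one snapshot per hour, 5 bands, concatenated along a time dimension — the following uses NumPy arrays as stand-ins for the real Zarr/Xarray data; the grid size here is a toy placeholder, not the actual 8 km global grid:

```python
import numpy as np

# Toy dimensions standing in for the real dataset's grid (illustrative only).
hours, bands, lat, lon = 24, 5, 4, 6

# One array per hourly snapshot, each holding all 5 bands.
snapshots = [np.zeros((bands, lat, lon)) for _ in range(hours)]

# Concatenate along a new leading "time" axis, analogous to how the
# converted files are concatenated along the time dimension per day.
day = np.stack(snapshots, axis=0)
print(day.shape)  # (24, 5, 4, 6)
```

In the actual dataset the analogous step is performed with Xarray on Zarr stores, which keeps the band and time coordinates labeled rather than positional.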
scikit-learn/auto-mpg | ---
license: apache-2.0
task_categories:
- tabular-classification
- tabular-regression
language:
- en
tags:
- scikit-learn
pretty_name: auto-mpg
---
## Auto Miles per Gallon (MPG) Dataset
The following description was taken from the [UCI machine learning repository](https://archive.ics.uci.edu/ml/datasets/auto+mpg).
Source: This dataset was taken from the StatLib library, which is maintained at Carnegie Mellon University. The dataset was used in the 1983 American Statistical Association Exposition.
## Data Set Information:
This dataset is a slightly modified version of the dataset provided in the StatLib library. In line with the use by Ross Quinlan (1993) in predicting the attribute "mpg", 8 of the original instances were removed because they had unknown values for the "mpg" attribute. The original dataset is available in the file "auto-mpg.data-original".
"The data concerns city-cycle fuel consumption in miles per gallon, to be predicted in terms of 3 multivalued discrete and 5 continuous attributes." (Quinlan, 1993)
## Attribute Information:
- mpg: continuous
- cylinders: multi-valued discrete
- displacement: continuous
- horsepower: continuous
- weight: continuous
- acceleration: continuous
- model year: multi-valued discrete
- origin: multi-valued discrete
- car name: string (unique for each instance) |
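To illustrate how the attribute list above maps onto parsed records, here is a minimal sketch assuming a CSV-style export of the data; the two sample rows are illustrative and the column order simply follows the listing above:

```python
import csv
import io

# Illustrative rows in the attribute order listed above:
# mpg, cylinders, displacement, horsepower, weight, acceleration,
# model year, origin, car name.
sample = """\
18.0,8,307.0,130.0,3504,12.0,70,1,chevrolet chevelle malibu
24.0,4,113.0,95.0,2372,15.0,70,3,toyota corona mark ii
"""

columns = ["mpg", "cylinders", "displacement", "horsepower", "weight",
           "acceleration", "model year", "origin", "car name"]

rows = [dict(zip(columns, r)) for r in csv.reader(io.StringIO(sample))]

# The continuous attributes parse as floats; the discrete ones stay as-is.
for row in rows:
    for key in ("mpg", "displacement", "horsepower", "weight", "acceleration"):
        row[key] = float(row[key])

print(rows[0]["mpg"])  # 18.0
```

The "mpg" column is the regression target; "car name" is unique per instance and is typically dropped before modeling.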
lamini/spider_train_all_bird_text_to_sql | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 38142759
num_examples: 17962
- name: test
num_bytes: 1090039
num_examples: 1034
download_size: 4171928
dataset_size: 39232798
---
# Dataset Card for "spider_train_all_bird_text_to_sql"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_cot_v4-math-54ae93-2018366735 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_cot_v4
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-66b_eval
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_cot_v4
dataset_config: mathemakitten--winobias_antistereotype_test_cot_v4
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-66b_eval
* Dataset: mathemakitten/winobias_antistereotype_test_cot_v4
* Config: mathemakitten--winobias_antistereotype_test_cot_v4
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
Porya/DSB | ---
license: openrail
task_categories:
- text-generation
language:
- en
pretty_name: DSBTrain
size_categories:
- n<1K
--- |
makaveli10/shrutilipi-whisper | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 157535021960.908
num_examples: 231324
- name: validation
num_bytes: 47062936319.286
num_examples: 68882
download_size: 506481590
dataset_size: 204597958280.194
---
# Dataset Card for "shrutilipi-whisper"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_AtAndDev__ShortKingv0.1 | ---
pretty_name: Evaluation run of AtAndDev/ShortKingv0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AtAndDev/ShortKingv0.1](https://huggingface.co/AtAndDev/ShortKingv0.1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AtAndDev__ShortKingv0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T00:55:05.543102](https://huggingface.co/datasets/open-llm-leaderboard/details_AtAndDev__ShortKingv0.1/blob/main/results_2023-10-24T00-55-05.543102.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0050335570469798654,\n\
\ \"em_stderr\": 0.0007247385547751907,\n \"f1\": 0.054680159395973316,\n\
\ \"f1_stderr\": 0.0014128539394208607,\n \"acc\": 0.28246387417700025,\n\
\ \"acc_stderr\": 0.007901602410009655\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0050335570469798654,\n \"em_stderr\": 0.0007247385547751907,\n\
\ \"f1\": 0.054680159395973316,\n \"f1_stderr\": 0.0014128539394208607\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.004548900682335102,\n \
\ \"acc_stderr\": 0.0018535550440036204\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5603788476716653,\n \"acc_stderr\": 0.01394964977601569\n\
\ }\n}\n```"
repo_url: https://huggingface.co/AtAndDev/ShortKingv0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T00_55_05.543102
path:
- '**/details_harness|drop|3_2023-10-24T00-55-05.543102.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T00-55-05.543102.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T00_55_05.543102
path:
- '**/details_harness|gsm8k|5_2023-10-24T00-55-05.543102.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T00-55-05.543102.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-59-37.972814.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-59-37.972814.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-59-37.972814.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T00_55_05.543102
path:
- '**/details_harness|winogrande|5_2023-10-24T00-55-05.543102.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T00-55-05.543102.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_59_37.972814
path:
- results_2023-10-03T17-59-37.972814.parquet
- split: 2023_10_24T00_55_05.543102
path:
- results_2023-10-24T00-55-05.543102.parquet
- split: latest
path:
- results_2023-10-24T00-55-05.543102.parquet
---
# Dataset Card for Evaluation run of AtAndDev/ShortKingv0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AtAndDev/ShortKingv0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AtAndDev/ShortKingv0.1](https://huggingface.co/AtAndDev/ShortKingv0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AtAndDev__ShortKingv0.1",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-24T00:55:05.543102](https://huggingface.co/datasets/open-llm-leaderboard/details_AtAndDev__ShortKingv0.1/blob/main/results_2023-10-24T00-55-05.543102.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0050335570469798654,
"em_stderr": 0.0007247385547751907,
"f1": 0.054680159395973316,
"f1_stderr": 0.0014128539394208607,
"acc": 0.28246387417700025,
"acc_stderr": 0.007901602410009655
},
"harness|drop|3": {
"em": 0.0050335570469798654,
"em_stderr": 0.0007247385547751907,
"f1": 0.054680159395973316,
"f1_stderr": 0.0014128539394208607
},
"harness|gsm8k|5": {
"acc": 0.004548900682335102,
"acc_stderr": 0.0018535550440036204
},
"harness|winogrande|5": {
"acc": 0.5603788476716653,
"acc_stderr": 0.01394964977601569
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jonoomph/openshot-usage | ---
license: gpl-3.0
---
|
HuggingFaceM4/LLaVAR-Instruct-16K | ---
dataset_info:
features:
- name: image
dtype: image
- name: user_texts
sequence: string
- name: bot_texts
sequence: string
splits:
- name: train
num_bytes: 433689449.5
num_examples: 15500
download_size: 487607994
dataset_size: 433689449.5
---
# Dataset Card for "LLaVAR-Instruct-16K"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anujsahani01/CodeParrot_tokenized | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 442225832
num_examples: 33986
- name: test
num_bytes: 147373912
num_examples: 11326
- name: validation
num_bytes: 288033632
num_examples: 22136
download_size: 112776043
dataset_size: 877633376
---
|
AlanRobotics/ruCoNaLa | ---
dataset_info:
features:
- name: snippet
dtype: string
- name: translated
dtype: string
splits:
- name: train
num_bytes: 1601026
num_examples: 11125
download_size: 892408
dataset_size: 1601026
---
# Dataset Card for "ruCoNaLa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kz919/open-orca-flan-50k-synthetic-reward-e5-mistral-7b-instruct-v2 | ---
license: apache-2.0
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: task
dtype: string
- name: ignos-Mistral-T5-7B-v1
dtype: string
- name: cognAI-lil-c3po
dtype: string
- name: viethq188-Rabbit-7B-DPO-Chat
dtype: string
- name: cookinai-DonutLM-v1
dtype: string
- name: v1olet-v1olet-merged-dpo-7B
dtype: string
- name: normalized_rewards
sequence: float32
- name: router_label
dtype: int64
splits:
- name: train
num_bytes: 105157970
num_examples: 50000
download_size: 48589672
dataset_size: 105157970
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zakAiDevops/noor-dtst | ---
dataset_info:
features:
- name: Instruction
dtype: string
- name: Output
dtype: string
splits:
- name: train
num_bytes: 70905
num_examples: 313
download_size: 30383
dataset_size: 70905
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
McSpicyWithMilo/reference-elements-0.1split-new-move | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: reference_element
dtype: string
splits:
- name: train
num_bytes: 11267.1
num_examples: 90
- name: test
num_bytes: 1251.9
num_examples: 10
download_size: 9516
dataset_size: 12519.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "reference-elements-0.1split-new-move"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dongyoung4091/hh-rlhf_with_features_flan_t5_large_DA_Bard_xl_zeroshot | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: helpfulness_chosen
dtype: int64
- name: helpfulness_rejected
dtype: int64
- name: specificity_chosen
dtype: int64
- name: specificity_rejected
dtype: int64
- name: intent_chosen
dtype: int64
- name: intent_rejected
dtype: int64
- name: factuality_chosen
dtype: int64
- name: factuality_rejected
dtype: int64
- name: easy-to-understand_chosen
dtype: int64
- name: easy-to-understand_rejected
dtype: int64
- name: relevance_chosen
dtype: int64
- name: relevance_rejected
dtype: int64
- name: readability_chosen
dtype: int64
- name: readability_rejected
dtype: int64
- name: enough-detail_chosen
dtype: int64
- name: enough-detail_rejected
dtype: int64
- name: biased:_chosen
dtype: int64
- name: biased:_rejected
dtype: int64
- name: fail-to-consider-individual-preferences_chosen
dtype: int64
- name: fail-to-consider-individual-preferences_rejected
dtype: int64
- name: repetetive_chosen
dtype: int64
- name: repetetive_rejected
dtype: int64
- name: fail-to-consider-context_chosen
dtype: int64
- name: fail-to-consider-context_rejected
dtype: int64
- name: too-long_chosen
dtype: int64
- name: too-long_rejected
dtype: int64
- name: human
dtype: string
- name: assistant_chosen
dtype: string
- name: assistant_rejected
dtype: string
- name: log_score_chosen
dtype: float64
- name: log_score_rejected
dtype: float64
- name: labels
dtype: string
- name: zeroshot_helpfulness_chosen
dtype: int64
- name: zeroshot_helpfulness_rejected
dtype: int64
- name: zeroshot_specificity_chosen
dtype: int64
- name: zeroshot_specificity_rejected
dtype: int64
- name: zeroshot_intent_chosen
dtype: int64
- name: zeroshot_intent_rejected
dtype: int64
- name: zeroshot_factuality_chosen
dtype: int64
- name: zeroshot_factuality_rejected
dtype: int64
- name: zeroshot_easy-to-understand_chosen
dtype: int64
- name: zeroshot_easy-to-understand_rejected
dtype: int64
- name: zeroshot_relevance_chosen
dtype: int64
- name: zeroshot_relevance_rejected
dtype: int64
- name: zeroshot_readability_chosen
dtype: int64
- name: zeroshot_readability_rejected
dtype: int64
- name: zeroshot_enough-detail_chosen
dtype: int64
- name: zeroshot_enough-detail_rejected
dtype: int64
- name: zeroshot_biased:_chosen
dtype: int64
- name: zeroshot_biased:_rejected
dtype: int64
- name: zeroshot_fail-to-consider-individual-preferences_chosen
dtype: int64
- name: zeroshot_fail-to-consider-individual-preferences_rejected
dtype: int64
- name: zeroshot_repetetive_chosen
dtype: int64
- name: zeroshot_repetetive_rejected
dtype: int64
- name: zeroshot_fail-to-consider-context_chosen
dtype: int64
- name: zeroshot_fail-to-consider-context_rejected
dtype: int64
- name: zeroshot_too-long_chosen
dtype: int64
- name: zeroshot_too-long_rejected
dtype: int64
splits:
- name: train
num_bytes: 16425816
num_examples: 9574
- name: test
num_bytes: 16369741
num_examples: 9574
download_size: 16118427
dataset_size: 32795557
---
# Dataset Card for "hh-rlhf_with_features_flan_t5_large_DA_Bard_xl_zeroshot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_openlm-research__open_llama_3b | ---
pretty_name: Evaluation run of openlm-research/open_llama_3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openlm-research/open_llama_3b](https://huggingface.co/openlm-research/open_llama_3b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 122 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 6 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"latest\" split always points to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openlm-research__open_llama_3b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T03:50:40.523576](https://huggingface.co/datasets/open-llm-leaderboard/details_openlm-research__open_llama_3b/blob/main/results_2023-10-18T03-50-40.523576.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0005243288590604027,\n\
\ \"em_stderr\": 0.00023443780464835776,\n \"f1\": 0.050632340604026965,\n\
\ \"f1_stderr\": 0.001271781500579302,\n \"acc\": 0.32587350322198844,\n\
\ \"acc_stderr\": 0.00764164157289629\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.00023443780464835776,\n\
\ \"f1\": 0.050632340604026965,\n \"f1_stderr\": 0.001271781500579302\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.004548900682335102,\n \
\ \"acc_stderr\": 0.0018535550440036204\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6471981057616417,\n \"acc_stderr\": 0.01342972810178896\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openlm-research/open_llama_3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|arc:challenge|25_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|arc:challenge|25_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T03_50_40.523576
path:
- '**/details_harness|drop|3_2023-10-18T03-50-40.523576.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T03-50-40.523576.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T03_50_40.523576
path:
- '**/details_harness|gsm8k|5_2023-10-18T03-50-40.523576.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T03-50-40.523576.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hellaswag|10_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hellaswag|10_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:22:49.433588.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:43:17.176281.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T11:22:49.433588.parquet'
- split: 2023_07_19T10_43_17.176281
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T10:43:17.176281.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T10:43:17.176281.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T03_50_40.523576
path:
- '**/details_harness|winogrande|5_2023-10-18T03-50-40.523576.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T03-50-40.523576.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T14:52:20.646698.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:31:38.653587.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:management|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:virology|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:management|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:virology|5_2023-08-29T12:36:41.239310.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:anatomy|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:astronomy|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:college_biology|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:college_physics|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:computer_security|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:econometrics|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:global_facts|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:human_aging|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:international_law|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:management|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:management|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:marketing|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:nutrition|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:philosophy|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:prehistory|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:professional_law|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:public_relations|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:security_studies|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:sociology|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:virology|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:virology|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-29T12:36:41.239310.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T14_52_20.646698
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T14:52:20.646698.parquet'
- split: 2023_08_28T20_31_38.653587
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:31:38.653587.parquet'
- split: 2023_08_29T12_36_41.239310
path:
- '**/details_original|mmlu:world_religions|5_2023-08-29T12:36:41.239310.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-29T12:36:41.239310.parquet'
- config_name: results
data_files:
- split: 2023_07_18T11_22_49.433588
path:
- results_2023-07-18T11:22:49.433588.parquet
- split: 2023_07_19T10_43_17.176281
path:
- results_2023-07-19T10:43:17.176281.parquet
- split: 2023_08_28T14_52_20.646698
path:
- results_2023-08-28T14:52:20.646698.parquet
- split: 2023_08_28T20_31_38.653587
path:
- results_2023-08-28T20:31:38.653587.parquet
- split: 2023_08_29T12_36_41.239310
path:
- results_2023-08-29T12:36:41.239310.parquet
- split: 2023_10_18T03_50_40.523576
path:
- results_2023-10-18T03-50-40.523576.parquet
- split: latest
path:
- results_2023-10-18T03-50-40.523576.parquet
---
# Dataset Card for Evaluation run of openlm-research/open_llama_3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openlm-research/open_llama_3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openlm-research/open_llama_3b](https://huggingface.co/openlm-research/open_llama_3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 6 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openlm-research__open_llama_3b",
"harness_winogrande_5",
split="train")
```
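Each run is also exposed as its own split, named after the run timestamp. A minimal sketch of the mapping between the two formats, assuming the naming convention visible in the configuration above (compare the split `2023_08_29T12_36_41.239310` with the parquet suffix `2023-08-29T12:36:41.239310`):

```python
# Split names in this repo appear to be derived from the run timestamp by
# replacing '-' and ':' with '_'; a small helper, assuming that convention:
def timestamp_to_split(ts: str) -> str:
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-08-29T12:36:41.239310"))
# 2023_08_29T12_36_41.239310
```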
## Latest results
These are the [latest results from run 2023-10-18T03:50:40.523576](https://huggingface.co/datasets/open-llm-leaderboard/details_openlm-research__open_llama_3b/blob/main/results_2023-10-18T03-50-40.523576.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0005243288590604027,
"em_stderr": 0.00023443780464835776,
"f1": 0.050632340604026965,
"f1_stderr": 0.001271781500579302,
"acc": 0.32587350322198844,
"acc_stderr": 0.00764164157289629
},
"harness|drop|3": {
"em": 0.0005243288590604027,
"em_stderr": 0.00023443780464835776,
"f1": 0.050632340604026965,
"f1_stderr": 0.001271781500579302
},
"harness|gsm8k|5": {
"acc": 0.004548900682335102,
"acc_stderr": 0.0018535550440036204
},
"harness|winogrande|5": {
"acc": 0.6471981057616417,
"acc_stderr": 0.01342972810178896
}
}
```
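As a quick sanity check on the `"all"` block, its `acc` equals the mean of the per-task accuracies (here gsm8k and winogrande); a minimal sketch:

```python
# Per-task accuracies copied from the latest-results JSON above.
results = {
    "harness|gsm8k|5": {"acc": 0.004548900682335102},
    "harness|winogrande|5": {"acc": 0.6471981057616417},
}
accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(mean_acc)  # matches the "all" acc of 0.32587350322198844
```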
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/cellica_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of cellica (Fire Emblem)
This is the dataset of cellica (Fire Emblem), containing 500 images and their tags.
The core tags of this character are `long_hair, red_hair, red_eyes, breasts, earrings, bangs, hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 571.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cellica_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 359.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cellica_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1080 | 696.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cellica_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 518.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cellica_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1080 | 918.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cellica_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/cellica_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
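Once extracted, the per-image tags can be aggregated to see which tags dominate the crawl. The `items_tags` lists below are hypothetical stand-ins for whatever `item.meta['tags']` actually contains (its exact structure depends on the waifuc version), so treat this as a sketch of the counting step only:

```python
from collections import Counter

# Hypothetical tag lists standing in for item.meta['tags'] per image.
items_tags = [
    ["1girl", "cape", "tiara"],
    ["1girl", "smile", "tiara"],
    ["1girl", "cape"],
]
# Flatten all per-image tag lists and count occurrences of each tag.
tag_counts = Counter(tag for tags in items_tags for tag in tags)
print(tag_counts.most_common(2))  # [('1girl', 3), ('cape', 2)]
```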
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, cape, fingerless_gloves, jewelry, looking_at_viewer, simple_background, smile, solo, tiara, armor, bare_shoulders, detached_collar, black_gloves, white_dress |
| 1 | 7 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, simple_background, solo, tiara, upper_body, armor, detached_collar, jewelry, smile, white_dress, closed_mouth, cape, cleavage, medium_breasts, white_background |
| 2 | 14 |  |  |  |  |  | 1girl, cape, dress, jewelry, solo, armor, fingerless_gloves, holding_sword, simple_background, smile, tiara, black_thighhighs, looking_at_viewer, zettai_ryouiki, detached_collar, white_background, bare_shoulders, boots, full_body, black_gloves, cowboy_shot |
| 3 | 5 |  |  |  |  |  | 1girl, navel, nipples, smile, solo, completely_nude, large_breasts, looking_at_viewer, medium_breasts, pussy, collarbone, outdoors, standing, thighs, water, blush, day, jewelry, nature, wading, yellow_eyes |
| 4 | 8 |  |  |  |  |  | 1girl, jewelry, solo_focus, thighhighs, hetero, open_mouth, tiara, 1boy, blush, breasts_out, clothed_sex, cowgirl_position, cum_in_pussy, girl_on_top, nipples, penis, vaginal, armor, black_gloves, fingerless_gloves, large_breasts, spread_legs, cape, detached_collar, medium_breasts, sweat, bar_censor, dress_lift, dress_pull, looking_at_viewer, white_dress |
| 5 | 7 |  |  |  |  |  | 1boy, 1girl, hetero, penis, solo_focus, bar_censor, fellatio, nipples, jewelry, nude, blush, large_breasts, medium_breasts, testicles |
| 6 | 12 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, blush, solo_focus, sex, large_breasts, open_mouth, penis, vaginal, cum_in_pussy, navel, bar_censor, completely_nude, tiara, jewelry, sweat |
| 7 | 5 |  |  |  |  |  | 1girl, barefoot, large_breasts, nipples, solo, arms_behind_back, blush, bondage, feet, looking_at_viewer, navel, rope, toes, completely_nude, pussy_juice, restrained, shibari, smile, spread_legs, sweat, thighs, clitoris, closed_mouth, jewelry, mosaic_censoring, squatting, uncensored |
| 8 | 7 |  |  |  |  |  | 1girl, blush, hetero, nipples, thighhighs, mmf_threesome, multiple_penises, anal, dark-skinned_male, nude, open_mouth, vaginal, 2boys, ass, blunt_bangs, double_penetration, interracial, jewelry, large_breasts, tongue_out, ahegao, cum, faceless_male, gloves, medium_breasts, mosaic_censoring, pussy, solo_focus, sweat, tiara |
| 9 | 5 |  |  |  |  |  | 1girl, navel, smile, beach, cleavage, cloud, hair_flower, looking_at_viewer, solo, white_bikini, alternate_costume, blue_sky, day, jewelry, medium_breasts, ocean, open_mouth, outdoors, collarbone, sitting, thighs, water |
| 10 | 6 |  |  |  |  |  | 1girl, bondage, gagged, rope, arms_behind_back, blush, solo, improvised_gag, jewelry, black_thighhighs, cleavage, large_breasts, medium_breasts, navel, orange_hair, panties, shibari |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cape | fingerless_gloves | jewelry | looking_at_viewer | simple_background | smile | solo | tiara | armor | bare_shoulders | detached_collar | black_gloves | white_dress | upper_body | closed_mouth | cleavage | medium_breasts | white_background | dress | holding_sword | black_thighhighs | zettai_ryouiki | boots | full_body | cowboy_shot | navel | nipples | completely_nude | large_breasts | pussy | collarbone | outdoors | standing | thighs | water | blush | day | nature | wading | yellow_eyes | solo_focus | thighhighs | hetero | open_mouth | 1boy | breasts_out | clothed_sex | cowgirl_position | cum_in_pussy | girl_on_top | penis | vaginal | spread_legs | sweat | bar_censor | dress_lift | dress_pull | fellatio | nude | testicles | sex | barefoot | arms_behind_back | bondage | feet | rope | toes | pussy_juice | restrained | shibari | clitoris | mosaic_censoring | squatting | uncensored | mmf_threesome | multiple_penises | anal | dark-skinned_male | 2boys | ass | blunt_bangs | double_penetration | interracial | tongue_out | ahegao | cum | faceless_male | gloves | beach | cloud | hair_flower | white_bikini | alternate_costume | blue_sky | ocean | sitting | gagged | improvised_gag | orange_hair | panties |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------|:--------------------|:----------|:--------------------|:--------------------|:--------|:-------|:--------|:--------|:-----------------|:------------------|:---------------|:--------------|:-------------|:---------------|:-----------|:-----------------|:-------------------|:--------|:----------------|:-------------------|:-----------------|:--------|:------------|:--------------|:--------|:----------|:------------------|:----------------|:--------|:-------------|:-----------|:-----------|:---------|:--------|:--------|:------|:---------|:---------|:--------------|:-------------|:-------------|:---------|:-------------|:-------|:--------------|:--------------|:-------------------|:---------------|:--------------|:--------|:----------|:--------------|:--------|:-------------|:-------------|:-------------|:-----------|:-------|:------------|:------|:-----------|:-------------------|:----------|:-------|:-------|:-------|:--------------|:-------------|:----------|:-----------|:-------------------|:------------|:-------------|:----------------|:-------------------|:-------|:--------------------|:--------|:------|:--------------|:---------------------|:--------------|:-------------|:---------|:------|:----------------|:---------|:--------|:--------|:--------------|:---------------|:--------------------|:-----------|:--------|:----------|:---------|:-----------------|:--------------|:----------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | X | | X | X | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | X | X | X | | | | X | X | | X | X | X | | | | X | | | | | | | | | | X | | X | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | | | X | | | | | | | | | | | | | | X | | | | | | | | | | X | | X | | | | | | | X | | | | | X | | X | | X | | | | | | X | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 12 |  |  |  |  |  | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | X | | | | | X | | X | X | X | | | | X | | X | X | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | X | X | | X | X | | | | | | | | X | | | | | | | | | | | X | X | X | X | | | | | X | | X | | | | | | | | | | | | | | | | | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | | | X | | | | | X | | | | | | | | | X | | | | | | | | | | X | | X | X | | | | | | X | | | | | X | X | X | X | | | | | | | | X | | X | | | | | X | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | | X | X | | X | X | | | | | | | | | X | X | | | | | | | | | X | | | | | X | X | | X | X | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | |
| 10 | 6 |  |  |  |  |  | X | | | X | | | | X | | | | | | | | | X | X | | | | X | | | | | X | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X |
|
jjonhwa/SECOND_SUMMARY_RAW | ---
dataset_info:
features:
- name: 문장
dtype: string
- name: 요약
dtype: string
splits:
- name: train
num_bytes: 104066240
num_examples: 30979
download_size: 62857147
dataset_size: 104066240
---
# Dataset Card for "SECOND_SUMMARY_RAW"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/TinyImagenet_200_validation | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': goldfish
'1': fire salamander
'2': American bullfrog
'3': tailed frog
'4': American alligator
'5': boa constrictor
'6': trilobite
'7': scorpion
'8': southern black widow
'9': tarantula
'10': centipede
'11': koala
'12': jellyfish
'13': brain coral
'14': snail
'15': sea slug
'16': American lobster
'17': spiny lobster
'18': black stork
'19': king penguin
'20': albatross
'21': dugong
'22': Yorkshire Terrier
'23': Golden Retriever
'24': Labrador Retriever
'25': German Shepherd Dog
'26': Standard Poodle
'27': tabby cat
'28': Persian cat
'29': Egyptian Mau
'30': cougar
'31': lion
'32': brown bear
'33': ladybug
'34': grasshopper
'35': stick insect
'36': cockroach
'37': praying mantis
'38': dragonfly
'39': monarch butterfly
'40': sulphur butterfly
'41': sea cucumber
'42': guinea pig
'43': pig
'44': ox
'45': bison
'46': bighorn sheep
'47': gazelle
'48': arabian camel
'49': orangutan
'50': chimpanzee
'51': baboon
'52': African bush elephant
'53': red panda
'54': abacus
'55': academic gown
'56': altar
'57': backpack
'58': baluster / handrail
'59': barbershop
'60': barn
'61': barrel
'62': basketball
'63': bathtub
'64': station wagon
'65': lighthouse
'66': beaker
'67': beer bottle
'68': bikini
'69': binoculars
'70': birdhouse
'71': bow tie
'72': brass memorial plaque
'73': bucket
'74': high-speed train
'75': butcher shop
'76': candle
'77': cannon
'78': cardigan
'79': automated teller machine
'80': CD player
'81': storage chest
'82': Christmas stocking
'83': cliff dwelling
'84': computer keyboard
'85': candy store
'86': convertible
'87': crane bird
'88': dam
'89': desk
'90': dining table
'91': dumbbell
'92': flagpole
'93': fly
'94': fountain
'95': freight car
'96': frying pan
'97': fur coat
'98': gas mask or respirator
'99': go-kart
'100': gondola
'101': hourglass
'102': iPod
'103': rickshaw
'104': kimono
'105': lampshade
'106': lawn mower
'107': lifeboat
'108': limousine
'109': magnetic compass
'110': maypole
'111': military uniform
'112': miniskirt
'113': moving van
'114': neck brace
'115': obelisk
'116': oboe
'117': pipe organ
'118': parking meter
'119': payphone
'120': picket fence
'121': pill bottle
'122': plunger
'123': police van
'124': poncho
'125': soda bottle
'126': potter's wheel
'127': missile
'128': punching bag
'129': refrigerator
'130': remote control
'131': rocking chair
'132': rugby ball
'133': sandal
'134': school bus
'135': scoreboard
'136': sewing machine
'137': snorkel
'138': sock
'139': sombrero
'140': space heater
'141': spider web
'142': sports car
'143': through arch bridge
'144': stopwatch
'145': sunglasses
'146': suspension bridge
'147': swim trunks / shorts
'148': syringe
'149': teapot
'150': teddy bear
'151': thatched roof
'152': torch
'153': tractor
'154': triumphal arch
'155': trolleybus
'156': turnstile
'157': umbrella
'158': vestment
'159': viaduct
'160': volleyball
'161': water jug
'162': water tower
'163': wok
'164': wooden spoon
'165': comic book
'166': fishing casting reel
'167': guacamole
'168': ice cream
'169': popsicle
'170': goose
'171': drumstick
'172': plate
'173': pretzel
'174': mashed potatoes
'175': cauliflower
'176': bell pepper
'177': lemon
'178': banana
'179': pomegranate
'180': meatloaf
'181': pizza
'182': pot pie
'183': espresso
'184': bee
'185': apron
'186': pole
'187': Chihuahua
'188': mountain
'189': cliff
'190': coral reef
'191': lakeshore
'192': beach
'193': acorn
'194': broom
'195': mushroom
'196': metal nail
'197': chain
'198': slug
'199': orange
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_L_14_simple_specific
dtype: string
- name: clip_tags_ViT_L_14_ensemble_specific
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B_simple_specific
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B_ensemble_specific
dtype: string
- name: id
dtype: int64
splits:
- name: validation
num_bytes: 507583.0
num_examples: 200
download_size: 372919
dataset_size: 507583.0
---
# Dataset Card for "TinyImagenet_200_validation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Barsius/fortrain | ---
license: unknown
---
|
liuyanchen1015/MULTI_VALUE_qqp_drop_copula_be_locative | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 230999
num_examples: 1408
- name: test
num_bytes: 2447761
num_examples: 15005
- name: train
num_bytes: 2019627
num_examples: 12431
download_size: 2903431
dataset_size: 4698387
---
# Dataset Card for "MULTI_VALUE_qqp_drop_copula_be_locative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/crossangetenshitoryuunorondo | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Cross Ange - Tenshi To Ryuu No Rondo
This is the image base of the bangumi Cross Ange - Tenshi to Ryuu no Rondo. We detected 67 characters and 4478 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to train models on this dataset, we recommend performing the necessary preprocessing on the downloaded data to eliminate potentially noisy samples (approximately 1% of images).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 40 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 111 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 30 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 22 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 229 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 64 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 117 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 67 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 179 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 40 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 38 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 40 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 24 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 32 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 79 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 69 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 21 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 28 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 201 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 90 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 601 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 121 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 95 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 56 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 96 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 25 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 32 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 18 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 13 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 37 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 13 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 12 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 24 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 156 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 16 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 74 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 19 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 14 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 80 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 33 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 23 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 8 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 24 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 21 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 23 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 195 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 9 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 23 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 15 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 42 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 62 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 24 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 8 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 16 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 279 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 42 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 7 | [Download](56/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 57 | 115 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 19 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 20 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 8 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 18 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 6 | [Download](62/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 63 | 9 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 5 | [Download](64/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 65 | 6 | [Download](65/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| noise | 395 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
st4lk1981/titou | ---
license: cc
---
|
ra100/lower-decks | ---
license: openrail
pretty_name: Star Trek Lower Decks people
size_categories:
- n<1K
configs:
- config_name: default
data_files:
- split: 'people-closeup'
path: 'data/people-closeup/*'
- split: 'people-group'
path: 'data/people-group/*'
- split: 'people-medium'
path: 'data/people-medium/*'
- split: 'people-non-uniform'
path: 'data/people-non-uniform/*'
- split: 'people-wide'
path: 'data/people-wide/*'
---
|
mindcrafterjesse/FursuitsAndArt | ---
license: unlicense
---
|
abhayesian/alpaca-cleaned | ---
dataset_info:
features:
- name: output
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 18833462
num_examples: 52002
download_size: 11952128
dataset_size: 18833462
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
allegro/cst-wikinews-en | ---
license: apache-2.0
task_categories:
- text-classification
language:
- pl
- en
pretty_name: Cst-Wikinews translated to English
size_categories:
- n<1K
---
All instances from `clarin-pl/cst-wikinews` (train, val, and test splits), translated to English with the Google Translate API.
Columns:
- `source` - text instance in Polish.
- `target` - text instance in English. |
autoevaluate/autoeval-eval-kmfoda__booksum-kmfoda__booksum-29029e-2376274531 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- kmfoda/booksum
eval_info:
task: summarization
model: google/bigbird-pegasus-large-bigpatent
metrics: []
dataset_name: kmfoda/booksum
dataset_config: kmfoda--booksum
dataset_split: test
col_mapping:
text: chapter
target: summary_text
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/bigbird-pegasus-large-bigpatent
* Dataset: kmfoda/booksum
* Config: kmfoda--booksum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
tr416/dataset_20231007_135609 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 73790
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231007_135609"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
swda | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- cc-by-nc-sa-3.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- extended|other-Switchboard-1 Telephone Speech Corpus, Release 2
task_categories:
- text-classification
task_ids:
- multi-label-classification
pretty_name: The Switchboard Dialog Act Corpus (SwDA)
dataset_info:
features:
- name: swda_filename
dtype: string
- name: ptb_basename
dtype: string
- name: conversation_no
dtype: int64
- name: transcript_index
dtype: int64
- name: act_tag
dtype:
class_label:
names:
'0': b^m^r
'1': qw^r^t
'2': aa^h
'3': br^m
'4': fa^r
'5': aa,ar
'6': sd^e(^q)^r
'7': ^2
'8': sd;qy^d
'9': oo
'10': bk^m
'11': aa^t
'12': cc^t
'13': qy^d^c
'14': qo^t
'15': ng^m
'16': qw^h
'17': qo^r
'18': aa
'19': qy^d^t
'20': qrr^d
'21': br^r
'22': fx
'23': sd,qy^g
'24': ny^e
'25': ^h^t
'26': fc^m
'27': qw(^q)
'28': co
'29': o^t
'30': b^m^t
'31': qr^d
'32': qw^g
'33': ad(^q)
'34': qy(^q)
'35': na^r
'36': am^r
'37': qr^t
'38': ad^c
'39': qw^c
'40': bh^r
'41': h^t
'42': ft^m
'43': ba^r
'44': qw^d^t
'45': '%'
'46': t3
'47': nn
'48': bd
'49': h^m
'50': h^r
'51': sd^r
'52': qh^m
'53': ^q^t
'54': sv^2
'55': ft
'56': ar^m
'57': qy^h
'58': sd^e^m
'59': qh^r
'60': cc
'61': fp^m
'62': ad
'63': qo
'64': na^m^t
'65': fo^c
'66': qy
'67': sv^e^r
'68': aap
'69': 'no'
'70': aa^2
'71': sv(^q)
'72': sv^e
'73': nd
'74': '"'
'75': bf^2
'76': bk
'77': fp
'78': nn^r^t
'79': fa^c
'80': ny^t
'81': ny^c^r
'82': qw
'83': qy^t
'84': b
'85': fo
'86': qw^r
'87': am
'88': bf^t
'89': ^2^t
'90': b^2
'91': x
'92': fc
'93': qr
'94': no^t
'95': bk^t
'96': bd^r
'97': bf
'98': ^2^g
'99': qh^c
'100': ny^c
'101': sd^e^r
'102': br
'103': fe
'104': by
'105': ^2^r
'106': fc^r
'107': b^m
'108': sd,sv
'109': fa^t
'110': sv^m
'111': qrr
'112': ^h^r
'113': na
'114': fp^r
'115': o
'116': h,sd
'117': t1^t
'118': nn^r
'119': cc^r
'120': sv^c
'121': co^t
'122': qy^r
'123': sv^r
'124': qy^d^h
'125': sd
'126': nn^e
'127': ny^r
'128': b^t
'129': ba^m
'130': ar
'131': bf^r
'132': sv
'133': bh^m
'134': qy^g^t
'135': qo^d^c
'136': qo^d
'137': nd^t
'138': aa^r
'139': sd^2
'140': sv;sd
'141': qy^c^r
'142': qw^m
'143': qy^g^r
'144': no^r
'145': qh(^q)
'146': sd;sv
'147': bf(^q)
'148': +
'149': qy^2
'150': qw^d
'151': qy^g
'152': qh^g
'153': nn^t
'154': ad^r
'155': oo^t
'156': co^c
'157': ng
'158': ^q
'159': qw^d^c
'160': qrr^t
'161': ^h
'162': aap^r
'163': bc^r
'164': sd^m
'165': bk^r
'166': qy^g^c
'167': qr(^q)
'168': ng^t
'169': arp
'170': h
'171': bh
'172': sd^c
'173': ^g
'174': o^r
'175': qy^c
'176': sd^e
'177': fw
'178': ar^r
'179': qy^m
'180': bc
'181': sv^t
'182': aap^m
'183': sd;no
'184': ng^r
'185': bf^g
'186': sd^e^t
'187': o^c
'188': b^r
'189': b^m^g
'190': ba
'191': t1
'192': qy^d(^q)
'193': nn^m
'194': ny
'195': ba,fe
'196': aa^m
'197': qh
'198': na^m
'199': oo(^q)
'200': qw^t
'201': na^t
'202': qh^h
'203': qy^d^m
'204': ny^m
'205': fa
'206': qy^d
'207': fc^t
'208': sd(^q)
'209': qy^d^r
'210': bf^m
'211': sd(^q)^t
'212': ft^t
'213': ^q^r
'214': sd^t
'215': sd(^q)^r
'216': ad^t
- name: damsl_act_tag
dtype:
class_label:
names:
'0': ad
'1': qo
'2': qy
'3': arp_nd
'4': sd
'5': h
'6': bh
'7': 'no'
'8': ^2
'9': ^g
'10': ar
'11': aa
'12': sv
'13': bk
'14': fp
'15': qw
'16': b
'17': ba
'18': t1
'19': oo_co_cc
'20': +
'21': ny
'22': qw^d
'23': x
'24': qh
'25': fc
'26': fo_o_fw_"_by_bc
'27': aap_am
'28': '%'
'29': bf
'30': t3
'31': nn
'32': bd
'33': ng
'34': ^q
'35': br
'36': qy^d
'37': fa
'38': ^h
'39': b^m
'40': ft
'41': qrr
'42': na
- name: caller
dtype: string
- name: utterance_index
dtype: int64
- name: subutterance_index
dtype: int64
- name: text
dtype: string
- name: pos
dtype: string
- name: trees
dtype: string
- name: ptb_treenumbers
dtype: string
- name: talk_day
dtype: string
- name: length
dtype: int64
- name: topic_description
dtype: string
- name: prompt
dtype: string
- name: from_caller
dtype: int64
- name: from_caller_sex
dtype: string
- name: from_caller_education
dtype: int64
- name: from_caller_birth_year
dtype: int64
- name: from_caller_dialect_area
dtype: string
- name: to_caller
dtype: int64
- name: to_caller_sex
dtype: string
- name: to_caller_education
dtype: int64
- name: to_caller_birth_year
dtype: int64
- name: to_caller_dialect_area
dtype: string
splits:
- name: train
num_bytes: 128498512
num_examples: 213543
- name: validation
num_bytes: 34749819
num_examples: 56729
- name: test
num_bytes: 2560127
num_examples: 4514
download_size: 14456364
dataset_size: 165808458
---
# Dataset Card for SwDA
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [The Switchboard Dialog Act Corpus](http://compprag.christopherpotts.net/swda.html)
- **Repository:** [cgpotts/swda](https://github.com/cgpotts/swda)
- **Paper:** [The Switchboard Dialog Act Corpus](http://compprag.christopherpotts.net/swda.html)
- **Leaderboard:** [Dialogue act classification](https://github.com/sebastianruder/NLP-progress/blob/master/english/dialogue.md#dialogue-act-classification)
- **Point of Contact:** [Christopher Potts](https://web.stanford.edu/~cgpotts/)
### Dataset Summary
The Switchboard Dialog Act Corpus (SwDA) extends the Switchboard-1 Telephone Speech Corpus, Release 2 with
turn/utterance-level dialog-act tags. The tags summarize syntactic, semantic, and pragmatic information about the
associated turn. The SwDA project was undertaken at UC Boulder in the late 1990s.
The SwDA is not inherently linked to the Penn Treebank 3 parses of Switchboard, and it is far from straightforward to
align the two resources. In addition, the SwDA is not distributed with the Switchboard's tables of metadata about the
conversations and their participants.
### Supported Tasks and Leaderboards
| Model | Accuracy | Paper / Source | Code |
| ------------- | :-----:| --- | --- |
| H-Seq2seq (Colombo et al., 2020) | 85.0 | [Guiding attention in Sequence-to-sequence models for Dialogue Act prediction](https://ojs.aaai.org/index.php/AAAI/article/view/6259/6115)
| SGNN (Ravi et al., 2018) | 83.1 | [Self-Governing Neural Networks for On-Device Short Text Classification](https://www.aclweb.org/anthology/D18-1105.pdf)
| CASA (Raheja et al., 2019) | 82.9 | [Dialogue Act Classification with Context-Aware Self-Attention](https://www.aclweb.org/anthology/N19-1373.pdf)
| DAH-CRF (Li et al., 2019) | 82.3 | [A Dual-Attention Hierarchical Recurrent Neural Network for Dialogue Act Classification](https://www.aclweb.org/anthology/K19-1036.pdf)
| ALDMN (Wan et al., 2018) | 81.5 | [Improved Dynamic Memory Network for Dialogue Act Classification with Adversarial Training](https://arxiv.org/pdf/1811.05021.pdf)
| CRF-ASN (Chen et al., 2018) | 81.3 | [Dialogue Act Recognition via CRF-Attentive Structured Network](https://arxiv.org/abs/1711.05568)
| Pretrained H-Transformer (Chapuis et al., 2020) | 79.3 | [Hierarchical Pre-training for Sequence Labelling in Spoken Dialog](https://www.aclweb.org/anthology/2020.findings-emnlp.239)
| Bi-LSTM-CRF (Kumar et al., 2017) | 79.2 | [Dialogue Act Sequence Labeling using Hierarchical encoder with CRF](https://arxiv.org/abs/1709.04250) | [Link](https://github.com/YanWenqiang/HBLSTM-CRF) |
| RNN with 3 utterances in context (Bothe et al., 2018) | 77.34 | [A Context-based Approach for Dialogue Act Recognition using Simple Recurrent Neural Networks](https://arxiv.org/abs/1805.06280) | |
### Languages
The language supported is English.
## Dataset Structure
Utterances are tagged with the [SWBD-DAMSL](https://web.stanford.edu/~jurafsky/ws97/manual.august1.html) dialog act annotation scheme.
### Data Instances
An example from the dataset is:
`{'act_tag': 115, 'caller': 'A', 'conversation_no': 4325, 'damsl_act_tag': 26, 'from_caller': 1632, 'from_caller_birth_year': 1962, 'from_caller_dialect_area': 'WESTERN', 'from_caller_education': 2, 'from_caller_sex': 'FEMALE', 'length': 5, 'pos': 'Okay/UH ./.', 'prompt': 'FIND OUT WHAT CRITERIA THE OTHER CALLER WOULD USE IN SELECTING CHILD CARE SERVICES FOR A PRESCHOOLER. IS IT EASY OR DIFFICULT TO FIND SUCH CARE?', 'ptb_basename': '4/sw4325', 'ptb_treenumbers': '1', 'subutterance_index': 1, 'swda_filename': 'sw00utt/sw_0001_4325.utt', 'talk_day': '03/23/1992', 'text': 'Okay. /', 'to_caller': 1519, 'to_caller_birth_year': 1971, 'to_caller_dialect_area': 'SOUTH MIDLAND', 'to_caller_education': 1, 'to_caller_sex': 'FEMALE', 'topic_description': 'CHILD CARE', 'transcript_index': 0, 'trees': '(INTJ (UH Okay) (. .) (-DFL- E_S))', 'utterance_index': 1}`
### Data Fields
* `swda_filename`: (str) The filename: directory/basename.
* `ptb_basename`: (str) The Treebank filename: add ".pos" for POS and ".mrg" for trees
* `conversation_no`: (int) The conversation Id, to key into the metadata database.
* `transcript_index`: (int) The line number of this item in the transcript (counting only utt lines).
* `act_tag`: (list of str) The Dialog Act Tags (separated by ||| in the file). Check Dialog act annotations for more details.
* `damsl_act_tag`: (list of str) The collapsed DAMSL Dialog Act Tags (the 217 `act_tag` variants mapped down to 43 classes).
* `caller`: (str) A, B, @A, @B, @@A, @@B
* `utterance_index`: (int) The encoded index of the utterance (the number in A.49, B.27, etc.)
* `subutterance_index`: (int) Utterances can be broken across lines; this gives the internal position.
* `text`: (str) The text of the utterance
* `pos`: (str) The POS tagged version of the utterance, from PtbBasename+.pos
* `trees`: (str) The tree(s) containing this utterance (separated by ||| in the file). Use `[Tree.fromstring(t) for t in row_value.split("|||")]` to convert to (list of nltk.tree.Tree).
* `ptb_treenumbers`: (list of int) The tree numbers in the PtbBasename+.mrg
* `talk_day`: (str) Date of talk.
* `length`: (int) Length of talk in seconds.
* `topic_description`: (str) Short description of topic that's being discussed.
* `prompt`: (str) Long description/query/instruction.
* `from_caller`: (int) The numerical Id of the from (A) caller.
* `from_caller_sex`: (str) MALE, FEMALE.
* `from_caller_education`: (int) Caller education level (0, 1, 2, 3, 9).
* `from_caller_birth_year`: (int) Caller birth year YYYY.
* `from_caller_dialect_area`: (str) MIXED, NEW ENGLAND, NORTH MIDLAND, NORTHERN, NYC, SOUTH MIDLAND, SOUTHERN, UNK, WESTERN.
* `to_caller`: (int) The numerical Id of the to (B) caller.
* `to_caller_sex`: (str) MALE, FEMALE.
* `to_caller_education`: (int) Caller education level (0, 1, 2, 3, 9).
* `to_caller_birth_year`: (int) Caller birth year YYYY.
* `to_caller_dialect_area`: (str) MIXED, NEW ENGLAND, NORTH MIDLAND, NORTHERN, NYC, SOUTH MIDLAND, SOUTHERN, UNK, WESTERN.
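Multi-valued fields such as `act_tag` and `trees` pack several values into one string separated by `|||`. As a dependency-free illustration (a sketch only — in practice `nltk.tree.Tree.fromstring` is the usual tool, as noted for the `trees` field above), the snippet below splits such a field and parses one bracketed Penn-Treebank-style tree into nested `(label, children)` tuples:

```python
def split_multi(field):
    """Fields like `act_tag` and `trees` pack several values with '|||'."""
    return [part.strip() for part in field.split("|||")]

def parse_tree(s):
    """Parse a Penn-Treebank-style bracketed string into nested tuples."""
    tokens = s.replace("(", " ( ").replace(")", " ) ").split()

    def helper(i):
        # tokens[i] must open a constituent: "(" label children... ")"
        assert tokens[i] == "("
        label = tokens[i + 1]
        i += 2
        children = []
        while tokens[i] != ")":
            if tokens[i] == "(":
                child, i = helper(i)   # nested constituent
                children.append(child)
            else:
                children.append(tokens[i])  # leaf token
                i += 1
        return (label, children), i + 1

    tree, _ = helper(0)
    return tree

# The `trees` value from the Data Instances example above:
raw = '(INTJ (UH Okay) (. .) (-DFL- E_S))'
tree = parse_tree(split_multi(raw)[0])
print(tree[0])       # INTJ
print(len(tree[1]))  # 3
```

With NLTK installed, the equivalent is the expression given in the field description: `[Tree.fromstring(t) for t in row_value.split("|||")]`.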
### Dialog act annotations
| | name | act_tag | example | train_count | full_count |
|----- |------------------------------- |---------------- |-------------------------------------------------- |------------- |------------ |
| 1 | Statement-non-opinion | sd | Me, I'm in the legal department. | 72824 | 75145 |
| 2 | Acknowledge (Backchannel) | b | Uh-huh. | 37096 | 38298 |
| 3 | Statement-opinion | sv | I think it's great | 25197 | 26428 |
| 4 | Agree/Accept | aa | That's exactly it. | 10820 | 11133 |
| 5 | Abandoned or Turn-Exit | % | So, - | 10569 | 15550 |
| 6 | Appreciation | ba | I can imagine. | 4633 | 4765 |
| 7 | Yes-No-Question | qy | Do you have to have any special training? | 4624 | 4727 |
| 8 | Non-verbal | x | [Laughter], [Throat_clearing] | 3548 | 3630 |
| 9 | Yes answers | ny | Yes. | 2934 | 3034 |
| 10 | Conventional-closing | fc | Well, it's been nice talking to you. | 2486 | 2582 |
| 11 | Uninterpretable | % | But, uh, yeah | 2158 | 15550 |
| 12 | Wh-Question | qw | Well, how old are you? | 1911 | 1979 |
| 13 | No answers | nn | No. | 1340 | 1377 |
| 14 | Response Acknowledgement | bk | Oh, okay. | 1277 | 1306 |
| 15 | Hedge | h | I don't know if I'm making any sense or not. | 1182 | 1226 |
| 16 | Declarative Yes-No-Question | qy^d | So you can afford to get a house? | 1174 | 1219 |
| 17 | Other | fo_o_fw_by_bc | Well give me a break, you know. | 1074 | 883 |
| 18 | Backchannel in question form | bh | Is that right? | 1019 | 1053 |
| 19 | Quotation | ^q | You can't be pregnant and have cats | 934 | 983 |
| 20 | Summarize/reformulate | bf | Oh, you mean you switched schools for the kids. | 919 | 952 |
| 21 | Affirmative non-yes answers | na | It is. | 836 | 847 |
| 22 | Action-directive | ad | Why don't you go first | 719 | 746 |
| 23 | Collaborative Completion | ^2 | Who aren't contributing. | 699 | 723 |
| 24 | Repeat-phrase | b^m | Oh, fajitas | 660 | 688 |
| 25 | Open-Question | qo | How about you? | 632 | 656 |
| 26 | Rhetorical-Questions | qh | Who would steal a newspaper? | 557 | 575 |
| 27 | Hold before answer/agreement | ^h | I'm drawing a blank. | 540 | 556 |
| 28 | Reject | ar | Well, no | 338 | 346 |
| 29 | Negative non-no answers | ng | Uh, not a whole lot. | 292 | 302 |
| 30 | Signal-non-understanding | br | Excuse me? | 288 | 298 |
| 31 | Other answers | no | I don't know | 279 | 286 |
| 32 | Conventional-opening | fp | How are you? | 220 | 225 |
| 33 | Or-Clause | qrr | or is it more of a company? | 207 | 209 |
| 34 | Dispreferred answers | arp_nd | Well, not so much that. | 205 | 207 |
| 35 | 3rd-party-talk | t3 | My goodness, Diane, get down from there. | 115 | 117 |
| 36 | Offers, Options, Commits | oo_co_cc | I'll have to check that out | 109 | 110 |
| 37 | Self-talk | t1 | What's the word I'm looking for | 102 | 103 |
| 38 | Downplayer | bd | That's all right. | 100 | 103 |
| 39 | Maybe/Accept-part | aap_am | Something like that | 98 | 105 |
| 40 | Tag-Question | ^g | Right? | 93 | 92 |
| 41 | Declarative Wh-Question | qw^d | You are what kind of buff? | 80 | 80 |
| 42 | Apology | fa | I'm sorry. | 76 | 79 |
| 43 | Thanking | ft | Hey thanks a lot | 67 | 78 |
### Data Splits
The split information below comes from the [Probabilistic-RNN-DA-Classifier](https://github.com/NathanDuran/Probabilistic-RNN-DA-Classifier) repo:
The training and test splits are the same as those used by [Stolcke et al. (2000)](https://web.stanford.edu/~jurafsky/ws97). The development set is a subset of the training set, used to speed up development and testing in the paper [Probabilistic Word Association for Dialogue Act Classification with Recurrent Neural Networks](https://www.researchgate.net/publication/326640934_Probabilistic_Word_Association_for_Dialogue_Act_Classification_with_Recurrent_Neural_Networks_19th_International_Conference_EANN_2018_Bristol_UK_September_3-5_2018_Proceedings).
|Dataset |# Transcripts |# Utterances |
|-----------|:-------------:|:-------------:|
|Training |1115 |192,768 |
|Validation |21 |3,196 |
|Test |19 |4,088 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
The SwDA is not inherently linked to the Penn Treebank 3 parses of Switchboard, and it is far from straightforward to align the two resources (Calhoun et al. 2010, §2.4). In addition, the SwDA is not distributed with the Switchboard's tables of metadata about the conversations and their participants.
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[Christopher Potts](https://web.stanford.edu/~cgpotts/), Stanford Linguistics.
### Licensing Information
This work is licensed under a [Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.](http://creativecommons.org/licenses/by-nc-sa/3.0/)
### Citation Information
```
@techreport{Jurafsky-etal:1997,
Address = {Boulder, CO},
Author = {Jurafsky, Daniel and Shriberg, Elizabeth and Biasca, Debra},
Institution = {University of Colorado, Boulder Institute of Cognitive Science},
Number = {97-02},
Title = {Switchboard {SWBD}-{DAMSL} Shallow-Discourse-Function Annotation Coders Manual, Draft 13},
Year = {1997}}
@article{Shriberg-etal:1998,
Author = {Shriberg, Elizabeth and Bates, Rebecca and Taylor, Paul and Stolcke, Andreas and Jurafsky, Daniel and Ries, Klaus and Coccaro, Noah and Martin, Rachel and Meteer, Marie and Van Ess-Dykema, Carol},
Journal = {Language and Speech},
Number = {3--4},
Pages = {439--487},
Title = {Can Prosody Aid the Automatic Classification of Dialog Acts in Conversational Speech?},
Volume = {41},
Year = {1998}}
@article{Stolcke-etal:2000,
Author = {Stolcke, Andreas and Ries, Klaus and Coccaro, Noah and Shriberg, Elizabeth and Bates, Rebecca and Jurafsky, Daniel and Taylor, Paul and Martin, Rachel and Meteer, Marie and Van Ess-Dykema, Carol},
Journal = {Computational Linguistics},
Number = {3},
Pages = {339--371},
Title = {Dialogue Act Modeling for Automatic Tagging and Recognition of Conversational Speech},
Volume = {26},
Year = {2000}}
```
### Contributions
Thanks to [@gmihaila](https://github.com/gmihaila) for adding this dataset. |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-27000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1034670
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vfleaking/hh-redteam-instruction33K | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 13888522
num_examples: 33407
download_size: 7097684
dataset_size: 13888522
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alex-tecky/common_voice_zh_hk_processed | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_features
sequence:
sequence:
sequence: float32
- name: labels
sequence: int64
- name: input_length
dtype: float64
splits:
- name: train
num_bytes: 13464160656.0
num_examples: 14018
- name: test
num_bytes: 5372062988
num_examples: 5593
download_size: 3041478840
dataset_size: 18836223644.0
---
# Dataset Card for "common_voice_zh_hk_processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mteb-pt/bucc-bitext-mining | ---
configs:
- config_name: de-pt
data_files:
- split: test
path: de-pt/test*
- config_name: fr-pt
data_files:
- split: test
path: fr-pt/test*
- config_name: ru-pt
data_files:
- split: test
path: ru-pt/test*
- config_name: zh-pt
data_files:
- split: test
path: zh-pt/test*
--- |
CyberHarem/ranger_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ranger/レンジャー/突击者 (Azur Lane)
This is the dataset of ranger/レンジャー/突击者 (Azur Lane), containing 62 images and their tags.
The core tags of this character are `long_hair, breasts, blue_eyes, large_breasts, hair_between_eyes, pink_hair, red_hair, bangs, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan), and the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 62 | 81.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ranger_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 62 | 48.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ranger_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 150 | 99.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ranger_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 62 | 71.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ranger_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 150 | 138.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ranger_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ranger_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | cleavage, looking_at_viewer, 1girl, bare_shoulders, blush, hat, solo, detached_sleeves, navel, collarbone, cowboy_shot, red_gloves, white_bikini, white_headwear, bikini_top_only, black_pants, front-tie_bikini_top, open_mouth, retrofit_(azur_lane), simple_background, anchor_choker, closed_mouth, midriff, stomach, streaked_hair, sweat, white_background |
| 1 | 15 |  |  |  |  |  | 1girl, detached_sleeves, solo, blush, cleavage, looking_at_viewer, pink_gloves, bare_shoulders, collarbone, headset, white_shirt, choker, partially_fingerless_gloves, open_mouth, skirt, white_background, between_breasts, jacket, open_vest, purple_eyes, simple_background, sleeveless, smile, thighhighs, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | cleavage | looking_at_viewer | 1girl | bare_shoulders | blush | hat | solo | detached_sleeves | navel | collarbone | cowboy_shot | red_gloves | white_bikini | white_headwear | bikini_top_only | black_pants | front-tie_bikini_top | open_mouth | retrofit_(azur_lane) | simple_background | anchor_choker | closed_mouth | midriff | stomach | streaked_hair | sweat | white_background | pink_gloves | headset | white_shirt | choker | partially_fingerless_gloves | skirt | between_breasts | jacket | open_vest | purple_eyes | sleeveless | smile | thighhighs | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------|:--------------------|:--------|:-----------------|:--------|:------|:-------|:-------------------|:--------|:-------------|:--------------|:-------------|:---------------|:-----------------|:------------------|:--------------|:-----------------------|:-------------|:-----------------------|:--------------------|:----------------|:---------------|:----------|:----------|:----------------|:--------|:-------------------|:--------------|:----------|:--------------|:---------|:------------------------------|:--------|:------------------|:---------|:------------|:--------------|:-------------|:--------|:-------------|:-------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | X | X | X | X | | X | X | | X | | | | | | | | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
AndyLiu0104/Soldering-Data-Annotation-ControlNet | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 62348716.0
num_examples: 1356
download_size: 55832669
dataset_size: 62348716.0
---
# Dataset Card for "Soldering-Data-Annotation-ControlNet"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dribar/Dan_Alpaca1 | ---
license: openrail
---
|
Falah/retro_style_photography | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 984913
num_examples: 10000
download_size: 13621
dataset_size: 984913
---
# Dataset Card for "retro_style_photography"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KasparZ/cyborg | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 818748
num_examples: 1442
download_size: 492101
dataset_size: 818748
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cyborg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dipteshkanojia/llama-2-qe-2023-indic-multi | ---
license: cc-by-nc-sa-4.0
language:
- en
- hi
- mr
- gu
- ta
- te
tags:
- quality estimation
- llama-2-format
- instruction tuning
- wmt 2023 data
size_categories:
- 10K<n<100K
---
This is the WMT 2023 shared task dataset for fine-tuning the meta-llama/Llama-2-13b-chat-hf model.
We have concatenated and shuffled the En-Gu, En-Hi, En-Mr, En-Ta, and En-Te data from both the training and validation sets. We have excluded just over 10 sample prompts for the in-context learning scenario with the test set.
Our sample prompt is:
```
<s>[INST] <<SYS>> You are a quality estimation model which accuractely predicts the translation quality as mean z_score. For perfectly meaningful translation, predict high z_score and for a meaningless or erroneous translation predict low z_score. Penalize the z_score on translation errors within [TGT] based on source sentence in [SRC]. Do not consider any other exisiting translation evaluation metrics. <</SYS>> For the following translation from English to Marathi, [SRC] Mudiyettu performers purify themselves through fasting and prayer, then draw a huge image of goddess Kali, called as kalam, on the temple floor with coloured powders, wherein the spirit of the goddess is invoked. [/SRC][TGT] मुडियेट्टु कलाकार उपवास आणि प्रार्थनेद्वारे स्वतःला शुद्ध करतात, त्यानंतर मंदिराच्या मजल्यावर काळम नावाच्या देवीची मोठी प्रतिमा काढतात, ज्यात देवीच्या आत्म्याची प्रार्थना केली जाते. [/TGT], predict the z_score [/INST] z_score: -0.4986</s>
``` |
argilla/distilabel-docs | ---
dataset_info:
features:
- name: input
dtype: string
- name: generation_model
dtype: string
- name: generation_prompt
dtype: string
- name: raw_generation_responses
list:
- name: choices
list:
- name: finish_reason
dtype: string
- name: index
dtype: int64
- name: logprobs
dtype: 'null'
- name: text
dtype: string
- name: created
dtype: int64
- name: id
dtype: string
- name: model
dtype: string
- name: object
dtype: string
- name: usage
struct:
- name: completion_tokens
dtype: int64
- name: prompt_tokens
dtype: int64
- name: total_tokens
dtype: int64
- name: generations
sequence: string
- name: labelling_model
dtype: string
- name: labelling_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_labelling_response
dtype: string
- name: rating
sequence: float64
- name: areas
list:
- name: Authenticity & Reliability
struct:
- name: rating
dtype: string
- name: rationale
dtype: string
- name: Clarity & Transparency
struct:
- name: rating
dtype: string
- name: rationale
dtype: string
- name: Compliance with Intent
struct:
- name: rating
dtype: string
- name: rationale
dtype: string
- name: Practical Accuracy
struct:
- name: rating
dtype: string
- name: rationale
dtype: string
splits:
- name: train
num_bytes: 79809
num_examples: 5
download_size: 100998
dataset_size: 79809
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "distilabel-docs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lansinuote/diffusion.3.dream_booth | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 5590823.0
num_examples: 5
download_size: 5592148
dataset_size: 5590823.0
---
# Dataset Card for "diffusion.3.dream_booth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
issai/kazsandra | ---
pretty_name: Kazakh Sentiment Analysis Dataset of Reviews and Attitudes
dataset_info:
- config_name: full
features:
- name: custom_id
dtype: string
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
0: "0 stars"
1: "1 star"
2: "2 stars"
3: "3 stars"
4: "4 stars"
5: "5 stars"
- name: domain
dtype: string
splits:
- name: train
num_bytes: 24381051
num_examples: 180064
- config_name: polarity_classification
features:
- name: custom_id
dtype: string
- name: text
dtype: string
- name: text_cleaned
dtype: string
- name: label
dtype:
class_label:
names:
0: "negative"
1: "positive"
- name: domain
dtype: string
splits:
- name: train
num_bytes: 32618403
num_examples: 134368
- name: validation
num_bytes: 4085072
num_examples: 16796
- name: test
num_bytes: 4285278
num_examples: 16797
- config_name: score_classification
features:
- name: custom_id
dtype: string
- name: text
dtype: string
- name: text_cleaned
dtype: string
- name: label
dtype:
class_label:
names:
0: "1 star"
1: "2 stars"
2: "3 stars"
3: "4 stars"
4: "5 stars"
- name: domain
dtype: string
splits:
- name: train
num_bytes: 34107559
num_examples: 140126
- name: validation
num_bytes: 4318229
num_examples: 17516
- name: test
num_bytes: 4235569
num_examples: 17516
configs:
- config_name: full
data_files:
- split: train
path: "full/full.csv"
default: true
- config_name: polarity_classification
data_files:
- split: train
path: "polarity_classification/train_pc.csv"
- split: validation
path: "polarity_classification/valid_pc.csv"
- split: test
path: "polarity_classification/test_pc.csv"
- config_name: score_classification
data_files:
- split: train
path: "score_classification/train_sc.csv"
- split: validation
path: "score_classification/valid_sc.csv"
- split: test
path: "score_classification/test_sc.csv"
license: cc-by-4.0
task_categories:
- text-classification
task_ids:
- sentiment-classification
language:
- kk
size_categories:
- 100K<n<1M
---
## Dataset Description
- **Repository:** https://github.com/IS2AI/KazSAnDRA
- **Paper:** https://arxiv.org/abs/2403.19335
<h1 align = "center">KazSAnDRA </h1>
<p align = "justify"><b>Kaz</b>akh <b>S</b>entiment <b>An</b>alysis <b>D</b>ataset of <b>R</b>eviews and <b>A</b>ttitudes, or KazSAnDRA, is a <a href = "https://github.com/IS2AI/KazSAnDRA">dataset</a> developed for Kazakh sentiment analysis.
KazSAnDRA comprises a collection of 180,064 reviews obtained from various sources and includes numerical ratings ranging from 1 to 5, providing a quantitative representation of customer attitudes.
</p>
<p>
In the <a href = "https://arxiv.org/abs/2403.19335">original study</a>, KazSAnDRA was utilised for two distinct tasks:
<ol>
<li>polarity classification (PC), involving the prediction of whether a review is positive or negative:</li>
<ul>
<li>reviews with original scores of 1 or 2 were classified as negative and assigned a new score of 0,</li>
<li>reviews with original scores of 4 or 5 were classified as positive and assigned a new score of 1,</li>
<li>reviews with an original score of 3 were categorised as neutral and were excluded from the task.</li>
</ul>
<li>score classification (SC), where the objective was to predict the score of a review on a scale ranging from 1 to 5. To align with the enumeration used for labelling in the classifier, which starts from 0 rather than 1, labels 1–5 were transformed into 0–4.</li>
</ol>
</p>
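<p align = "justify">The score-to-label mappings described above can be sketched as follows (a minimal illustration of the relabelling rules, not the authors' preprocessing code; the function names are hypothetical):</p>

```python
def pc_label(score: int):
    """Polarity classification (PC): map an original 1-5 review score
    to 0 (negative, scores 1-2) or 1 (positive, scores 4-5).
    Neutral score-3 reviews are excluded, so None is returned for them."""
    if score in (1, 2):
        return 0
    if score in (4, 5):
        return 1
    return None  # score == 3: excluded from the PC task


def sc_label(score: int) -> int:
    """Score classification (SC): shift the 1-5 star rating to the
    0-4 label range used by the classifier."""
    return score - 1
```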
<p align = "justify">
KazSAnDRA consists of seven CSV files. File <b>full.csv</b> contains all the 180,064 reviews and ratings from 1 to 5. Files <b>train_pc.csv</b>, <b>valid_pc.csv</b>, and
<b>test_pc.csv</b> are the training, validation, and testing sets for the polarity classification task, respectively. Files <b>train_sc.csv</b>, <b>valid_sc.csv</b>, and
<b>test_sc.csv</b> are the training, validation, and testing sets for the score classification task, in turn.
</p>
<p align = "justify">
All files, except for <b>full.csv</b>, include records containing a custom review identifier (<i>custom_id</i>), the original review text (<i>text</i>), the pre-processed review text (<i>text_cleaned</i>), the corresponding review score (<i>label</i>), and the domain information (<i>domain</i>).
File <b>full.csv</b> includes records containing a custom review identifier (<i>custom_id</i>), the original review text (<i>text</i>), the corresponding review score (<i>label</i>), and the domain information (<i>domain</i>).
</p>
<h2 align = "center">Dataset Statistics</h2>
<p align = "justify">For the sake of maintaining consistency and facilitating reproducibility of our experimental outcomes among different research groups, we partitioned KaZSAnDRA into three distinct sets: training (train), validation (valid), and testing (test) sets, following an 80/10/10 ratio.</p>
<table align="center">
<tr align="center">
<td rowspan="3"><b>Task</b></td>
<td colspan="2"><b>Train</b></td>
<td colspan="2"><b>Valid</b></td>
<td colspan="2"><b>Test</b></td>
<td colspan="2"><b>Total</b></td>
</tr>
<tr></tr>
<tr align="center">
<td><b>#</b></td>
<td><b>%</b></td>
<td><b>#</b></td>
<td><b>%</b></td>
<td><b>#</b></td>
<td><b>%</b></td>
<td><b>#</b></td>
<td><b>%</b></td>
</tr>
<tr></tr>
<tr align="center">
<td>PC</td>
<td>134,368</td>
<td>80</td>
<td>16,796</td>
<td>10</td>
<td>16,797</td>
<td>10</td>
<td>167,961</td>
<td>100</td>
</tr>
<tr></tr>
<tr align="center">
<td>SC</td>
<td>140,126</td>
<td>80</td>
<td>17,516</td>
<td>10</td>
<td>17,516</td>
<td>10</td>
<td>175,158</td>
<td>100</td>
</tr>
</table>
<p align = "justify">The distribution of reviews across the three sets based on their domains and scores for the PC task:</p>
<table align="center">
<thead>
<tr align="center">
<th rowspan="3">Domain</th>
<th colspan="2">Train</th>
<th colspan="2">Valid</th>
<th colspan="2">Test</th>
</tr>
<tr></tr>
<tr align="center">
<th>#</th>
<th>%</th>
<th>#</th>
<th>%</th>
<th>#</th>
<th>%</th>
</tr>
</thead>
<tbody>
<tr align="center">
<td>Appstore</td>
<td>101,477</td>
<td>75.52</td>
<td>12,685</td>
<td>75.52</td>
<td>12,685</td>
<td>75.52</td>
</tr>
<tr></tr>
<tr align="center">
<td>Market</td>
<td>22,561</td>
<td>16.79</td>
<td>2,820</td>
<td>16.79</td>
<td>2,820</td>
<td>16.79</td>
</tr>
<tr></tr>
<tr align="center">
<td>Mapping</td>
<td>6,509</td>
<td>4.84</td>
<td>813</td>
<td>4.84</td>
<td>814</td>
<td>4.85</td>
</tr>
<tr></tr>
<tr align="center">
<td>Bookstore</td>
<td>3,821</td>
<td>2.84</td>
<td>478</td>
<td>2.85</td>
<td>478</td>
<td>2.85</td>
</tr>
<tr></tr>
<tr align="center">
<td><b>Total</b></td>
<td><b>134,368</b></td>
<td><b>100</b></td>
<td><b>16,796</b></td>
<td><b>100</b></td>
<td><b>16,797</b></td>
<td><b>100</b></td>
</tr>
</tbody>
</table>
<table align="center">
<thead>
<tr align="center">
<th rowspan="3">Score</th>
<th colspan="2">Train</th>
<th colspan="2">Valid</th>
<th colspan="2">Test</th>
</tr>
<tr></tr>
<tr align="center">
<th>#</th>
<th>%</th>
<th>#</th>
<th>%</th>
<th>#</th>
<th>%</th>
</tr>
</thead>
<tbody>
<tr align="center">
<td>1</td>
<td>110,417</td>
<td>82.18</td>
<td>13,801</td>
<td>82.17</td>
<td>13,804</td>
<td>82.18</td>
</tr>
<tr></tr>
<tr align="center">
<td>0</td>
<td>23,951</td>
<td>17.82</td>
<td>2,995</td>
<td>17.83</td>
<td>2,993</td>
<td>17.82</td>
</tr>
<tr></tr>
<tr align="center">
<td><b>Total</b></td>
<td><b>134,368</b></td>
<td><b>100</b></td>
<td><b>16,796</b></td>
<td><b>100</b></td>
<td><b>16,797</b></td>
<td><b>100</b></td>
</tr>
</tbody>
</table>
<p align = "justify">The distribution of reviews across the three sets based on their domains and scores for the SC task:</p>
<table align="center">
<thead>
<tr align="center">
<th rowspan="3">Domain</th>
<th colspan="2">Train</th>
<th colspan="2">Valid</th>
<th colspan="2">Test</th>
</tr>
<tr></tr>
<tr align="center">
<th>#</th>
<th>%</th>
<th>#</th>
<th>%</th>
<th>#</th>
<th>%</th>
</tr>
</thead>
<tbody>
<tr align="center">
<td>Appstore</td>
<td>106,058</td>
<td>75.69</td>
<td>13,258</td>
<td>75.69</td>
<td>13,257</td>
<td>75.69</td>
</tr>
<tr></tr>
<tr align="center">
<td>Market</td>
<td>23,278</td>
<td>16.61</td>
<td>2,909</td>
<td>16.61</td>
<td>2,910</td>
<td>16.61</td>
</tr>
<tr></tr>
<tr align="center">
<td>Mapping</td>
<td>6,794</td>
<td>4.85</td>
<td>849</td>
<td>4.85</td>
<td>849</td>
<td>4.85</td>
</tr>
<tr></tr>
<tr align="center">
<td>Bookstore</td>
<td>3,996</td>
<td>2.85</td>
<td>500</td>
<td>2.85</td>
<td>500</td>
<td>2.85</td>
</tr>
<tr></tr>
<tr align="center">
<td><b>Total</b></td>
<td><b>140,126</b></td>
<td><b>100</b></td>
<td><b>17,516</b></td>
<td><b>100</b></td>
<td><b>17,516</b></td>
<td><b>100</b></td>
</tr>
</tbody>
</table>
<table align="center">
<thead>
<tr align="center">
<th rowspan="3">Score</th>
<th colspan="2">Train</th>
<th colspan="2">Valid</th>
<th colspan="2">Test</th>
</tr>
<tr></tr>
<tr align="center">
<th>#</th>
<th>%</th>
<th>#</th>
<th>%</th>
<th>#</th>
<th>%</th>
</tr>
</thead>
<tbody>
<tr align="center">
<td>5</td>
<td>101,302</td>
<td>72.29</td>
<td>12,663</td>
<td>72.29</td>
<td>12,663</td>
<td>72.29</td>
</tr>
<tr></tr>
<tr align="center">
<td>1</td>
<td>20,031</td>
<td>14.29</td>
<td>2,504</td>
<td>14.30</td>
<td>2,504</td>
<td>14.30</td>
</tr>
<tr></tr>
<tr align="center">
<td>4</td>
<td>9,115</td>
<td>6.50</td>
<td>1,140</td>
<td>6.51</td>
<td>1,139</td>
<td>6.50</td>
</tr>
<tr></tr>
<tr align="center">
<td>3</td>
<td>5,758</td>
<td>4.11</td>
<td>719</td>
<td>4.10</td>
<td>720</td>
<td>4.11</td>
</tr>
<tr></tr>
<tr align="center">
<td>2</td>
<td>3,920</td>
<td>2.80</td>
<td>490</td>
<td>2.80</td>
<td>490</td>
<td>2.80</td>
</tr>
<tr></tr>
<tr align="center">
<td><b>Total</b></td>
<td><b>140,126</b></td>
<td><b>100</b></td>
<td><b>17,516</b></td>
<td><b>100</b></td>
<td><b>17,516</b></td>
<td><b>100</b></td>
</tr>
</tbody>
</table>
<h2 align = "center">How to Use</h2>
<p align = "justify">To load the subsets of KazSAnDRA separately:</p>
```python
from datasets import load_dataset
full = load_dataset("issai/kazsandra", "full")
pc = load_dataset("issai/kazsandra", "polarity_classification")
sc = load_dataset("issai/kazsandra", "score_classification")
``` |