thai_toxicity_tweet | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- th
license:
- cc-by-nc-3.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- sentiment-classification
pretty_name: ThaiToxicityTweet
dataset_info:
features:
- name: tweet_id
dtype: string
- name: tweet_text
dtype: string
- name: toxic_votes
dtype: int32
- name: nontoxic_votes
dtype: int32
- name: is_toxic
dtype:
class_label:
names:
'0': neg
'1': pos
config_name: thai_toxicity_tweet
splits:
- name: train
num_bytes: 637387
num_examples: 3300
download_size: 194740
dataset_size: 637387
---
# Dataset Card for `thai_toxicity_tweet`
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/tmu-nlp/ThaiToxicityTweetCorpus/
- **Repository:** https://github.com/tmu-nlp/ThaiToxicityTweetCorpus/
- **Paper:** https://www.ta-cos.org/sites/ta-cos.org/files/1_W32.pdf
- **Leaderboard:**
- **Point of Contact:** https://www.ta-cos.org/sites/ta-cos.org/files/1_W32.pdf
### Dataset Summary
The Thai Toxicity Tweet Corpus contains 3,300 tweets (506 of which now have missing texts; see the note below) annotated by humans following guidelines that include a 44-word dictionary.
The authors obtained 2,027 toxic and 1,273 non-toxic tweets, each labeled by three annotators. Corpus analysis indicates that tweets containing
toxic words are not always toxic; a tweet is more likely to be toxic if it uses those words in their original sense. Disagreements in annotation
are primarily due to sarcasm, unclear targets, and word-sense ambiguity.
Notes from the data cleaner: the data was added to [huggingface/datasets](https://www.github.com/huggingface/datasets) in December 2020. By that time, 506 of the tweets were no longer publicly available; these are denoted by `TWEET_NOT_FOUND` in `tweet_text`.
Processing can be found at [this PR](https://github.com/tmu-nlp/ThaiToxicityTweetCorpus/pull/1).
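Since 506 entries are placeholders rather than real texts, downstream users will typically want to drop them first. A minimal sketch (the records below are illustrative; in practice they would come from `datasets.load_dataset("thai_toxicity_tweet")`):

```python
# Illustrative records standing in for rows of the dataset;
# unavailable tweets carry the TWEET_NOT_FOUND placeholder.
records = [
    {"tweet_id": "1", "tweet_text": "TWEET_NOT_FOUND", "is_toxic": 0},
    {"tweet_id": "2", "tweet_text": "some surviving tweet", "is_toxic": 1},
]

# Keep only rows whose text is still available.
available = [r for r in records if r["tweet_text"] != "TWEET_NOT_FOUND"]
print(len(available))  # 1
```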
### Supported Tasks and Leaderboards
Text classification (binary toxicity classification). No leaderboard is listed.
### Languages
Thai (`th`)
## Dataset Structure
### Data Instances
```
{'is_toxic': 0, 'nontoxic_votes': 3, 'toxic_votes': 0, 'tweet_id': '898576382384418817', 'tweet_text': 'วันๆ นี่คุยกะหมา แมว หมู ไก่ ม้า ควาย มากกว่าคุยกับคนไปละ'}
{'is_toxic': 1, 'nontoxic_votes': 0, 'toxic_votes': 3, 'tweet_id': '898573084981985280', 'tweet_text': 'ควายแดงเมิงด่ารัฐบาลจนรองนายกป่วย พวกมึงกำลังทำลายชาติรู้มั้ย มั้ย มั้ย มั้ยยยยยยยยย news.voicetv.co.th/thailand/51672…'}
```
### Data Fields
"tweet_id": Id of tweet on Twitter
"tweet_text": text of the tweet
"toxic_votes": how many annotators say it is toxic, out of 3 annotators
"nontoxic_votes": how many annotators say it is NOT toxic, out of 3 annotators
"is_toxic": 1 if tweet is toxic else 0 (majority rules)
### Data Splits
No explicit split is given.
## Dataset Creation
### Curation Rationale
The dataset is created as part of [Sirihattasak et al (2019)](https://www.ta-cos.org/sites/ta-cos.org/files/1_W32.pdf).
### Source Data
#### Initial Data Collection and Normalization
The authors used the public Twitter Search API to collect 9,819 tweets from January to December 2017 based on their keyword dictionary. They then selected 75 tweets for each keyword, collecting 3,300 tweets in total for annotation. To ensure data quality, they set the following selection criteria.
1. All tweets were selected by humans to prevent word ambiguity. (The Twitter API selects tweets based on the characters in the keyword; for example, for "บ้า" (crazy), the API would also return "บ้านนอก" (countryside), which was not a target.)
2. Tweets had to be sufficiently long to discern their context, so a minimum of five words was required.
3. Tweets containing only extremely toxic words (for example: "damn, retard, bitch, f*ck, slut!!!") were not considered.
4. Tweets with English words were allowed if those words were not critical to the labeling decision, for example, the word "f*ck." As a result, the corpus contains English words, but they make up less than 2% of the total.
All hashtags, retweets, and links were removed from these tweets. Emoticons, however, were kept, because they can convey the real intent of the post's author. Furthermore, only for annotation, some entries such as the names of famous people were replaced with the tag `<ไม่ขอเปิดเผยชื่อ>` for anonymity, to prevent individual bias.
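Criterion 1 above exists because naive substring matching over-selects: the keyword "บ้า" (crazy) is literally a substring of "บ้านนอก" (countryside), so a character-based search flags both. A quick check:

```python
# Why human selection was needed: substring matching cannot tell a
# keyword used as a word from a keyword buried inside another word.
keyword = "บ้า"          # "crazy"

assert keyword in "บ้านนอก"        # false positive: "countryside"
assert keyword in "เขาเป็นคนบ้า"   # genuine occurrence of the keyword
```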
#### Who are the source language producers?
Twitter users in Thailand
### Annotations
#### Annotation process
The authors manually annotated the dataset with two labels: Toxic and Non-Toxic. A message is defined as toxic if it indicates any harmful, damaging, or negative intent, based on their definition of toxicity. All tweets were annotated by three annotators to identify toxicity, under the conditions in the following list.
- A toxic message is a message that should be deleted or not allowed in public.
- The message must have a target or consequence: either an individual, a group generalized by a commonality such as religion or ethnicity, or an entire community.
- Self-complaint is not considered toxic, because it is not harmful to anyone; however, if it is intended to indicate something bad, it is considered toxic.
- Both direct and indirect messages, including those with sarcasm, are taken into consideration.
All annotators were strictly instructed on these concepts and given a small test to ensure they understood the conditions. The annotation process was divided into two rounds: candidates first annotated a trial set to learn the annotation standard, then annotated a different dataset, and only those who obtained a full score in the second round were selected as annotators. 20% of the candidates failed the first round and were not involved in the final annotation.
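Because each tweet carries the raw vote counts rather than only the majority label, simple agreement statistics can be recovered from the released fields. A hedged sketch (`is_unanimous` is a hypothetical helper, and the vote tuples below are illustrative, not dataset statistics):

```python
# Agreement among the three annotators, computed from the
# (toxic_votes, nontoxic_votes) pairs stored in the dataset.
def is_unanimous(toxic_votes: int, nontoxic_votes: int) -> bool:
    """True when all three annotators agreed on the label."""
    return toxic_votes == 3 or nontoxic_votes == 3

votes = [(0, 3), (3, 0), (2, 1), (1, 2)]  # illustrative pairs
agreement_rate = sum(is_unanimous(t, n) for t, n in votes) / len(votes)
print(agreement_rate)  # 0.5
```

Disagreements (2-vs-1 splits) are exactly the sarcasm / unclear-target / word-sense cases discussed in the summary.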
#### Who are the annotators?
Three annotators hired by [Sirihattasak et al (2019)](https://www.ta-cos.org/sites/ta-cos.org/files/1_W32.pdf)
### Personal and Sensitive Information
Although all tweets are public, the dataset by its nature contains personal attacks and toxic language.
## Considerations for Using the Data
### Social Impact of Dataset
- The dataset can be used to build classifiers for toxic social media messages in Thai.
### Discussion of Biases
- Author identities are masked before annotation to prevent biases based on who wrote the tweet.
### Other Known Limitations
- The data was added to [huggingface/datasets](https://www.github.com/huggingface/datasets) in December 2020. By that time, 506 of the tweets were no longer publicly available; these are denoted by `TWEET_NOT_FOUND` in `tweet_text`.
## Additional Information
### Dataset Curators
[Sirihattasak et al (2019)](https://www.ta-cos.org/sites/ta-cos.org/files/1_W32.pdf)
### Licensing Information
CC-BY-NC 3.0
### Citation Information
Please cite the following if you make use of the dataset:
```
@article{sirihattasak2019annotation,
title={Annotation and Classification of Toxicity for Thai Twitter},
author={Sirihattasak, Sugan and Komachi, Mamoru and Ishikawa, Hiroshi},
year={2019}
}
```
### Contributions
Thanks to [@cstorm125](https://github.com/cstorm125) for adding this dataset. |
Axel578/OIDyyyyoisfnmsfhsognsdhiogosdnoghros | ---
dataset_info:
features:
- name: image
sequence:
sequence:
sequence: uint8
- name: label
dtype: string
splits:
- name: train
num_bytes: 3016874623
num_examples: 1413831
download_size: 437484293
dataset_size: 3016874623
---
# Dataset Card for "OIDyyyyoisfnmsfhsognsdhiogosdnoghros"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
oreva/ppl_gpt2-large_ranked_squad | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: prompt
dtype: string
- name: ppl_gpt2-large
dtype: float64
splits:
- name: train
num_bytes: 138319124
num_examples: 77087
download_size: 86663439
dataset_size: 138319124
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ppl_gpt2-large_ranked_squad"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jbpark0614/speechocean762_train | ---
dataset_info:
features:
- name: index
dtype: int64
- name: speaker_id_str
dtype: int64
- name: speaker_id
dtype: int64
- name: question_id
dtype: int64
- name: total_score
dtype: int64
- name: accuracy
dtype: int64
- name: completeness
dtype: float64
- name: fluency
dtype: int64
- name: prosodic
dtype: int64
- name: text
dtype: string
- name: audio
dtype: audio
- name: path
dtype: string
splits:
- name: train
num_bytes: 290407029.0
num_examples: 2500
download_size: 316008757
dataset_size: 290407029.0
---
# Dataset Card for "speechocean762_train"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlFrauch/im2latex | ---
task_categories:
- image-to-text
tags:
- code
size_categories:
- 1M<n<10M
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset is a set of pairs, each consisting of an image and the corresponding LaTeX code for the expression it depicts. The pairs were generated by analyzing more than 100,000 articles in the natural sciences and mathematics and producing a corresponding set of LaTeX expressions. The set has been cleared of duplicates and contains about 1,500,000 images.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
LaTeX (the labels are LaTeX markup rather than a natural language).
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
```python
Dataset({
features: ['image', 'text'],
num_rows: 1586584
})
```
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@misc{alexfrauch_VSU_2023,
    title = {Recognition of mathematical formulas in the Latex: Image-Text Pair Dataset},
    author = {Aleksandr Frauch (Proshunin)},
    year = {2023},
    howpublished = {\url{https://huggingface.co/datasets/AlFrauch/im2latex}},
}
```
### Contributions
[More Information Needed] |
marcuskwan/my_test_data | ---
license: mit
---
|
open-llm-leaderboard/details_lqtrung1998__galactica-6.7b-ReFT-GSM8k | ---
pretty_name: Evaluation run of lqtrung1998/galactica-6.7b-ReFT-GSM8k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lqtrung1998/galactica-6.7b-ReFT-GSM8k](https://huggingface.co/lqtrung1998/galactica-6.7b-ReFT-GSM8k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lqtrung1998__galactica-6.7b-ReFT-GSM8k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-05T00:29:00.141185](https://huggingface.co/datasets/open-llm-leaderboard/details_lqtrung1998__galactica-6.7b-ReFT-GSM8k/blob/main/results_2024-03-05T00-29-00.141185.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3736017404729396,\n\
\ \"acc_stderr\": 0.03419358676137801,\n \"acc_norm\": 0.37887305414992606,\n\
\ \"acc_norm_stderr\": 0.035086364144539854,\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156496,\n \"mc2\": 0.4120886476277968,\n\
\ \"mc2_stderr\": 0.014388497221701243\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3643344709897611,\n \"acc_stderr\": 0.014063260279882417,\n\
\ \"acc_norm\": 0.4069965870307167,\n \"acc_norm_stderr\": 0.014356399418009128\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3921529575781717,\n\
\ \"acc_stderr\": 0.004872326888655527,\n \"acc_norm\": 0.5033857797251543,\n\
\ \"acc_norm_stderr\": 0.004989667009372639\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.39622641509433965,\n \"acc_stderr\": 0.030102793781791194,\n\
\ \"acc_norm\": 0.39622641509433965,\n \"acc_norm_stderr\": 0.030102793781791194\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.35260115606936415,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.35260115606936415,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364396,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364396\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.030472973363380035,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.030472973363380035\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101803,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101803\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110175,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110175\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.46774193548387094,\n \"acc_stderr\": 0.028384747788813332,\n \"\
acc_norm\": 0.46774193548387094,\n \"acc_norm_stderr\": 0.028384747788813332\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n \"\
acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.43636363636363634,\n \"acc_stderr\": 0.03872592983524754,\n\
\ \"acc_norm\": 0.43636363636363634,\n \"acc_norm_stderr\": 0.03872592983524754\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.48484848484848486,\n \"acc_stderr\": 0.03560716516531061,\n \"\
acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.03560716516531061\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.43523316062176165,\n \"acc_stderr\": 0.035780381650085846,\n\
\ \"acc_norm\": 0.43523316062176165,\n \"acc_norm_stderr\": 0.035780381650085846\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3717948717948718,\n \"acc_stderr\": 0.02450347255711094,\n \
\ \"acc_norm\": 0.3717948717948718,\n \"acc_norm_stderr\": 0.02450347255711094\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.03156663099215416,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.03156663099215416\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5137614678899083,\n \"acc_stderr\": 0.021429202089874075,\n \"\
acc_norm\": 0.5137614678899083,\n \"acc_norm_stderr\": 0.021429202089874075\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3088235294117647,\n \"acc_stderr\": 0.03242661719827218,\n \"\
acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.03242661719827218\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.38396624472573837,\n \"acc_stderr\": 0.031658678064106674,\n \
\ \"acc_norm\": 0.38396624472573837,\n \"acc_norm_stderr\": 0.031658678064106674\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.33183856502242154,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.33183856502242154,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.40458015267175573,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.40458015267175573,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4214876033057851,\n \"acc_stderr\": 0.045077322787750944,\n \"\
acc_norm\": 0.4214876033057851,\n \"acc_norm_stderr\": 0.045077322787750944\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.39814814814814814,\n\
\ \"acc_stderr\": 0.04732332615978813,\n \"acc_norm\": 0.39814814814814814,\n\
\ \"acc_norm_stderr\": 0.04732332615978813\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3128834355828221,\n \"acc_stderr\": 0.03642914578292405,\n\
\ \"acc_norm\": 0.3128834355828221,\n \"acc_norm_stderr\": 0.03642914578292405\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697623,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697623\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4368932038834951,\n \"acc_stderr\": 0.04911147107365777,\n\
\ \"acc_norm\": 0.4368932038834951,\n \"acc_norm_stderr\": 0.04911147107365777\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.36752136752136755,\n\
\ \"acc_stderr\": 0.03158539157745636,\n \"acc_norm\": 0.36752136752136755,\n\
\ \"acc_norm_stderr\": 0.03158539157745636\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3997445721583653,\n\
\ \"acc_stderr\": 0.01751684790705328,\n \"acc_norm\": 0.3997445721583653,\n\
\ \"acc_norm_stderr\": 0.01751684790705328\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.34971098265895956,\n \"acc_stderr\": 0.025674281456531025,\n\
\ \"acc_norm\": 0.34971098265895956,\n \"acc_norm_stderr\": 0.025674281456531025\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.48366013071895425,\n \"acc_stderr\": 0.028614624752805407,\n\
\ \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.028614624752805407\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3954983922829582,\n\
\ \"acc_stderr\": 0.027770918531427838,\n \"acc_norm\": 0.3954983922829582,\n\
\ \"acc_norm_stderr\": 0.027770918531427838\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.026571483480719974,\n\
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.026571483480719974\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.02624492034984301,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.02624492034984301\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.303129074315515,\n\
\ \"acc_stderr\": 0.011738669951254293,\n \"acc_norm\": 0.303129074315515,\n\
\ \"acc_norm_stderr\": 0.011738669951254293\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3104575163398693,\n \"acc_stderr\": 0.018718067052623223,\n \
\ \"acc_norm\": 0.3104575163398693,\n \"acc_norm_stderr\": 0.018718067052623223\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.35454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505415,\n \"acc_norm\": 0.35454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505415\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4326530612244898,\n \"acc_stderr\": 0.03171752824062664,\n\
\ \"acc_norm\": 0.4326530612244898,\n \"acc_norm_stderr\": 0.03171752824062664\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.48258706467661694,\n\
\ \"acc_stderr\": 0.035333892347392454,\n \"acc_norm\": 0.48258706467661694,\n\
\ \"acc_norm_stderr\": 0.035333892347392454\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.03301405946987251,\n\
\ \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.03301405946987251\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156496,\n \"mc2\": 0.4120886476277968,\n\
\ \"mc2_stderr\": 0.014388497221701243\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5832675611681136,\n \"acc_stderr\": 0.013856250072796323\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006823351023502654,\n \
\ \"acc_stderr\": 0.0022675371022544805\n }\n}\n```"
repo_url: https://huggingface.co/lqtrung1998/galactica-6.7b-ReFT-GSM8k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|arc:challenge|25_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|gsm8k|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hellaswag|10_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T00-29-00.141185.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-05T00-29-00.141185.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- '**/details_harness|winogrande|5_2024-03-05T00-29-00.141185.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-05T00-29-00.141185.parquet'
- config_name: results
data_files:
- split: 2024_03_05T00_29_00.141185
path:
- results_2024-03-05T00-29-00.141185.parquet
- split: latest
path:
- results_2024-03-05T00-29-00.141185.parquet
---
# Dataset Card for Evaluation run of lqtrung1998/galactica-6.7b-ReFT-GSM8k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [lqtrung1998/galactica-6.7b-ReFT-GSM8k](https://huggingface.co/lqtrung1998/galactica-6.7b-ReFT-GSM8k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lqtrung1998__galactica-6.7b-ReFT-GSM8k",
	"harness_winogrande_5",
	split="latest")
```
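The per-task config names follow a regular pattern: `harness_` plus the task identifier with `-` and `:` replaced by `_`, plus the number of few-shot examples. As a minimal sketch (this helper is not part of the dataset itself, just an illustration of the naming convention), you could build a config name from a harness task identifier like so:

```python
def task_to_config(task: str, num_fewshot: int) -> str:
    """Map a harness task id to this dataset's config name.

    e.g. "hendrycksTest-abstract_algebra" with 5 shots
    -> "harness_hendrycksTest_abstract_algebra_5"
    """
    # Config names use underscores where task ids use "-" or ":".
    normalized = task.replace("-", "_").replace(":", "_")
    return f"harness_{normalized}_{num_fewshot}"


print(task_to_config("hendrycksTest-abstract_algebra", 5))
# harness_hendrycksTest_abstract_algebra_5
print(task_to_config("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

This matches the configurations listed in the metadata above (e.g. `harness_winogrande_5`, `harness_truthfulqa_mc_0`).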
## Latest results
These are the [latest results from run 2024-03-05T00:29:00.141185](https://huggingface.co/datasets/open-llm-leaderboard/details_lqtrung1998__galactica-6.7b-ReFT-GSM8k/blob/main/results_2024-03-05T00-29-00.141185.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.3736017404729396,
"acc_stderr": 0.03419358676137801,
"acc_norm": 0.37887305414992606,
"acc_norm_stderr": 0.035086364144539854,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156496,
"mc2": 0.4120886476277968,
"mc2_stderr": 0.014388497221701243
},
"harness|arc:challenge|25": {
"acc": 0.3643344709897611,
"acc_stderr": 0.014063260279882417,
"acc_norm": 0.4069965870307167,
"acc_norm_stderr": 0.014356399418009128
},
"harness|hellaswag|10": {
"acc": 0.3921529575781717,
"acc_stderr": 0.004872326888655527,
"acc_norm": 0.5033857797251543,
"acc_norm_stderr": 0.004989667009372639
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.39622641509433965,
"acc_stderr": 0.030102793781791194,
"acc_norm": 0.39622641509433965,
"acc_norm_stderr": 0.030102793781791194
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.35260115606936415,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.35260115606936415,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364396,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364396
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.030472973363380035,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.030472973363380035
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101803,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101803
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110175,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110175
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.46774193548387094,
"acc_stderr": 0.028384747788813332,
"acc_norm": 0.46774193548387094,
"acc_norm_stderr": 0.028384747788813332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.43636363636363634,
"acc_stderr": 0.03872592983524754,
"acc_norm": 0.43636363636363634,
"acc_norm_stderr": 0.03872592983524754
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.03560716516531061,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.03560716516531061
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.43523316062176165,
"acc_stderr": 0.035780381650085846,
"acc_norm": 0.43523316062176165,
"acc_norm_stderr": 0.035780381650085846
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3717948717948718,
"acc_stderr": 0.02450347255711094,
"acc_norm": 0.3717948717948718,
"acc_norm_stderr": 0.02450347255711094
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.03156663099215416,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.03156663099215416
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5137614678899083,
"acc_stderr": 0.021429202089874075,
"acc_norm": 0.5137614678899083,
"acc_norm_stderr": 0.021429202089874075
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.03242661719827218,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.03242661719827218
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.38396624472573837,
"acc_stderr": 0.031658678064106674,
"acc_norm": 0.38396624472573837,
"acc_norm_stderr": 0.031658678064106674
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.33183856502242154,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.33183856502242154,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.40458015267175573,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.40458015267175573,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4214876033057851,
"acc_stderr": 0.045077322787750944,
"acc_norm": 0.4214876033057851,
"acc_norm_stderr": 0.045077322787750944
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.04732332615978813,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.04732332615978813
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3128834355828221,
"acc_stderr": 0.03642914578292405,
"acc_norm": 0.3128834355828221,
"acc_norm_stderr": 0.03642914578292405
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697623,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697623
},
"harness|hendrycksTest-management|5": {
"acc": 0.4368932038834951,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.4368932038834951,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.36752136752136755,
"acc_stderr": 0.03158539157745636,
"acc_norm": 0.36752136752136755,
"acc_norm_stderr": 0.03158539157745636
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3997445721583653,
"acc_stderr": 0.01751684790705328,
"acc_norm": 0.3997445721583653,
"acc_norm_stderr": 0.01751684790705328
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.34971098265895956,
"acc_stderr": 0.025674281456531025,
"acc_norm": 0.34971098265895956,
"acc_norm_stderr": 0.025674281456531025
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.48366013071895425,
"acc_stderr": 0.028614624752805407,
"acc_norm": 0.48366013071895425,
"acc_norm_stderr": 0.028614624752805407
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3954983922829582,
"acc_stderr": 0.027770918531427838,
"acc_norm": 0.3954983922829582,
"acc_norm_stderr": 0.027770918531427838
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.026571483480719974,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.026571483480719974
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.02624492034984301,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.02624492034984301
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.303129074315515,
"acc_stderr": 0.011738669951254293,
"acc_norm": 0.303129074315515,
"acc_norm_stderr": 0.011738669951254293
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3104575163398693,
"acc_stderr": 0.018718067052623223,
"acc_norm": 0.3104575163398693,
"acc_norm_stderr": 0.018718067052623223
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.35454545454545455,
"acc_stderr": 0.04582004841505415,
"acc_norm": 0.35454545454545455,
"acc_norm_stderr": 0.04582004841505415
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4326530612244898,
"acc_stderr": 0.03171752824062664,
"acc_norm": 0.4326530612244898,
"acc_norm_stderr": 0.03171752824062664
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.48258706467661694,
"acc_stderr": 0.035333892347392454,
"acc_norm": 0.48258706467661694,
"acc_norm_stderr": 0.035333892347392454
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.03301405946987251,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.03301405946987251
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156496,
"mc2": 0.4120886476277968,
"mc2_stderr": 0.014388497221701243
},
"harness|winogrande|5": {
"acc": 0.5832675611681136,
"acc_stderr": 0.013856250072796323
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.0022675371022544805
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lukaemon/bbh | ---
dataset_info:
- config_name: boolean_expressions
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 11790
num_examples: 250
download_size: 17172
dataset_size: 11790
- config_name: causal_judgement
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 198021
num_examples: 187
download_size: 202943
dataset_size: 198021
- config_name: date_understanding
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 54666
num_examples: 250
download_size: 61760
dataset_size: 54666
- config_name: disambiguation_qa
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 78620
num_examples: 250
download_size: 85255
dataset_size: 78620
- config_name: dyck_languages
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 38432
num_examples: 250
download_size: 43814
dataset_size: 38432
- config_name: formal_fallacies
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 138224
num_examples: 250
download_size: 145562
dataset_size: 138224
- config_name: geometric_shapes
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 68560
num_examples: 250
download_size: 77242
dataset_size: 68560
- config_name: hyperbaton
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 38574
num_examples: 250
download_size: 44706
dataset_size: 38574
- config_name: logical_deduction_five_objects
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 148595
num_examples: 250
download_size: 155477
dataset_size: 148595
- config_name: logical_deduction_seven_objects
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 191022
num_examples: 250
download_size: 198404
dataset_size: 191022
- config_name: logical_deduction_three_objects
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 105831
num_examples: 250
download_size: 112213
dataset_size: 105831
- config_name: movie_recommendation
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 50985
num_examples: 250
download_size: 57684
dataset_size: 50985
- config_name: multistep_arithmetic_two
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 12943
num_examples: 250
download_size: 18325
dataset_size: 12943
- config_name: navigate
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 49031
num_examples: 250
download_size: 55163
dataset_size: 49031
- config_name: object_counting
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 30508
num_examples: 250
download_size: 35890
dataset_size: 30508
- config_name: penguins_in_a_table
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 70062
num_examples: 146
download_size: 74516
dataset_size: 70062
- config_name: reasoning_about_colored_objects
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 89579
num_examples: 250
download_size: 98694
dataset_size: 89579
- config_name: ruin_names
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 46537
num_examples: 250
download_size: 53178
dataset_size: 46537
- config_name: salient_translation_error_detection
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 277110
num_examples: 250
download_size: 286443
dataset_size: 277110
- config_name: snarks
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 38223
num_examples: 178
download_size: 42646
dataset_size: 38223
- config_name: sports_understanding
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 22723
num_examples: 250
download_size: 28617
dataset_size: 22723
- config_name: temporal_sequences
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 139546
num_examples: 250
download_size: 148176
dataset_size: 139546
- config_name: tracking_shuffled_objects_five_objects
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 162590
num_examples: 250
download_size: 169722
dataset_size: 162590
- config_name: tracking_shuffled_objects_seven_objects
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 207274
num_examples: 250
download_size: 214906
dataset_size: 207274
- config_name: tracking_shuffled_objects_three_objects
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 122104
num_examples: 250
download_size: 128736
dataset_size: 122104
- config_name: web_of_lies
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 47582
num_examples: 250
download_size: 52964
dataset_size: 47582
- config_name: word_sorting
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: test
num_bytes: 60918
num_examples: 250
download_size: 66300
dataset_size: 60918
---
# BIG-bench Hard dataset
homepage: https://github.com/suzgunmirac/BIG-Bench-Hard
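Each BBH task is a separate configuration loaded by name. A minimal sketch — the 27 config names below are taken directly from this card's metadata, and the commented call assumes the `datasets` library:

```python
# The 27 BIG-Bench Hard task configs declared in this card's metadata.
BBH_CONFIGS = [
    "boolean_expressions", "causal_judgement", "date_understanding",
    "disambiguation_qa", "dyck_languages", "formal_fallacies",
    "geometric_shapes", "hyperbaton", "logical_deduction_five_objects",
    "logical_deduction_seven_objects", "logical_deduction_three_objects",
    "movie_recommendation", "multistep_arithmetic_two", "navigate",
    "object_counting", "penguins_in_a_table",
    "reasoning_about_colored_objects", "ruin_names",
    "salient_translation_error_detection", "snarks",
    "sports_understanding", "temporal_sequences",
    "tracking_shuffled_objects_five_objects",
    "tracking_shuffled_objects_seven_objects",
    "tracking_shuffled_objects_three_objects",
    "web_of_lies", "word_sorting",
]
assert len(BBH_CONFIGS) == 27

# Each config exposes a single "test" split with "input"/"target" columns:
# from datasets import load_dataset
# ds = load_dataset("lukaemon/bbh", "boolean_expressions", split="test")
```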
```
@article{suzgun2022challenging,
title={Challenging BIG-Bench Tasks and Whether Chain-of-Thought Can Solve Them},
  author={Suzgun, Mirac and Scales, Nathan and Sch{\"a}rli, Nathanael and Gehrmann, Sebastian and Tay, Yi and Chung, Hyung Won and Chowdhery, Aakanksha and Le, Quoc V and Chi, Ed H and Zhou, Denny and Wei, Jason},
journal={arXiv preprint arXiv:2210.09261},
year={2022}
}
``` |
juliozhao/SHR | ---
license: apache-2.0
---
|
NetherlandsForensicInstitute/coco-captions-translated-nl | ---
license: cc-by-sa-4.0
viewer: true
task_categories:
- sentence-similarity
language:
- nl
size_categories:
- 100K<n<1M
---
This is a Dutch version of the [Coco captions](https://cocodataset.org/#download) dataset, which we auto-translated from English into Dutch using Meta's [No Language Left Behind](https://ai.facebook.com/research/no-language-left-behind/) model, specifically the [Hugging Face implementation](https://huggingface.co/facebook/nllb-200-distilled-600M). |
gayanin/pubmed-abstracts-noised-with-prob-dist-v2 | ---
dataset_info:
- config_name: babylon-prob-01
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 6298703
num_examples: 24908
- name: test
num_bytes: 794582
num_examples: 3113
- name: validation
num_bytes: 784437
num_examples: 3114
download_size: 4438345
dataset_size: 7877722
- config_name: babylon-prob-02
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 6131860
num_examples: 24908
- name: test
num_bytes: 772976
num_examples: 3113
- name: validation
num_bytes: 763170
num_examples: 3114
download_size: 4431105
dataset_size: 7668006
- config_name: babylon-prob-03
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 5963382
num_examples: 24908
- name: test
num_bytes: 751530
num_examples: 3113
- name: validation
num_bytes: 743139
num_examples: 3114
download_size: 4411104
dataset_size: 7458051
- config_name: babylon-prob-04
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 5794478
num_examples: 24908
- name: test
num_bytes: 730929
num_examples: 3113
- name: validation
num_bytes: 720849
num_examples: 3114
download_size: 4374101
dataset_size: 7246256
- config_name: babylon-prob-05
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 5634718
num_examples: 24908
- name: test
num_bytes: 708651
num_examples: 3113
- name: validation
num_bytes: 701862
num_examples: 3114
download_size: 4336094
dataset_size: 7045231
- config_name: gcd-prob-01
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 5623412
num_examples: 24908
- name: test
num_bytes: 774353
num_examples: 3114
- name: validation
num_bytes: 772363
num_examples: 3114
download_size: 4026552
dataset_size: 7170128
- config_name: gcd-prob-02
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 5386733
num_examples: 24908
- name: test
num_bytes: 742236
num_examples: 3114
- name: validation
num_bytes: 739965
num_examples: 3114
download_size: 3926230
dataset_size: 6868934
- config_name: gcd-prob-03
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 5151749
num_examples: 24908
- name: test
num_bytes: 709209
num_examples: 3114
- name: validation
num_bytes: 706547
num_examples: 3114
download_size: 3806924
dataset_size: 6567505
- config_name: gcd-prob-04
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 4914469
num_examples: 24908
- name: test
num_bytes: 678027
num_examples: 3114
- name: validation
num_bytes: 676635
num_examples: 3114
download_size: 3674828
dataset_size: 6269131
- config_name: gcd-prob-05
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 4682536
num_examples: 24908
- name: test
num_bytes: 643943
num_examples: 3114
- name: validation
num_bytes: 644068
num_examples: 3114
download_size: 3536779
dataset_size: 5970547
- config_name: kaggle-prob-01
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 6254746
num_examples: 24908
- name: test
num_bytes: 787330
num_examples: 3113
- name: validation
num_bytes: 783533
num_examples: 3114
download_size: 4393817
dataset_size: 7825609
- config_name: kaggle-prob-02
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 6002616
num_examples: 24908
- name: test
num_bytes: 753845
num_examples: 3113
- name: validation
num_bytes: 751722
num_examples: 3114
download_size: 4291924
dataset_size: 7508183
- config_name: kaggle-prob-03
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 5747484
num_examples: 24908
- name: test
num_bytes: 722481
num_examples: 3113
- name: validation
num_bytes: 719629
num_examples: 3114
download_size: 4175521
dataset_size: 7189594
- config_name: kaggle-prob-04
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 5496897
num_examples: 24908
- name: test
num_bytes: 692009
num_examples: 3113
- name: validation
num_bytes: 688458
num_examples: 3114
download_size: 4054340
dataset_size: 6877364
- config_name: kaggle-prob-05
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 5243270
num_examples: 24908
- name: test
num_bytes: 658650
num_examples: 3113
- name: validation
num_bytes: 658178
num_examples: 3114
download_size: 3911586
dataset_size: 6560098
configs:
- config_name: babylon-prob-01
data_files:
- split: train
path: babylon-prob-01/train-*
- split: test
path: babylon-prob-01/test-*
- split: validation
path: babylon-prob-01/validation-*
- config_name: babylon-prob-02
data_files:
- split: train
path: babylon-prob-02/train-*
- split: test
path: babylon-prob-02/test-*
- split: validation
path: babylon-prob-02/validation-*
- config_name: babylon-prob-03
data_files:
- split: train
path: babylon-prob-03/train-*
- split: test
path: babylon-prob-03/test-*
- split: validation
path: babylon-prob-03/validation-*
- config_name: babylon-prob-04
data_files:
- split: train
path: babylon-prob-04/train-*
- split: test
path: babylon-prob-04/test-*
- split: validation
path: babylon-prob-04/validation-*
- config_name: babylon-prob-05
data_files:
- split: train
path: babylon-prob-05/train-*
- split: test
path: babylon-prob-05/test-*
- split: validation
path: babylon-prob-05/validation-*
- config_name: gcd-prob-01
data_files:
- split: train
path: gcd-prob-01/train-*
- split: test
path: gcd-prob-01/test-*
- split: validation
path: gcd-prob-01/validation-*
- config_name: gcd-prob-02
data_files:
- split: train
path: gcd-prob-02/train-*
- split: test
path: gcd-prob-02/test-*
- split: validation
path: gcd-prob-02/validation-*
- config_name: gcd-prob-03
data_files:
- split: train
path: gcd-prob-03/train-*
- split: test
path: gcd-prob-03/test-*
- split: validation
path: gcd-prob-03/validation-*
- config_name: gcd-prob-04
data_files:
- split: train
path: gcd-prob-04/train-*
- split: test
path: gcd-prob-04/test-*
- split: validation
path: gcd-prob-04/validation-*
- config_name: gcd-prob-05
data_files:
- split: train
path: gcd-prob-05/train-*
- split: test
path: gcd-prob-05/test-*
- split: validation
path: gcd-prob-05/validation-*
- config_name: kaggle-prob-01
data_files:
- split: train
path: kaggle-prob-01/train-*
- split: test
path: kaggle-prob-01/test-*
- split: validation
path: kaggle-prob-01/validation-*
- config_name: kaggle-prob-02
data_files:
- split: train
path: kaggle-prob-02/train-*
- split: test
path: kaggle-prob-02/test-*
- split: validation
path: kaggle-prob-02/validation-*
- config_name: kaggle-prob-03
data_files:
- split: train
path: kaggle-prob-03/train-*
- split: test
path: kaggle-prob-03/test-*
- split: validation
path: kaggle-prob-03/validation-*
- config_name: kaggle-prob-04
data_files:
- split: train
path: kaggle-prob-04/train-*
- split: test
path: kaggle-prob-04/test-*
- split: validation
path: kaggle-prob-04/validation-*
- config_name: kaggle-prob-05
data_files:
- split: train
path: kaggle-prob-05/train-*
- split: test
path: kaggle-prob-05/test-*
- split: validation
path: kaggle-prob-05/validation-*
---
|
Marcis/Fleetway_Super_Sonic | ---
license: openrail
---
|
open-llm-leaderboard/details_feidfoe__Metamath-reproduce-7b | ---
pretty_name: Evaluation run of feidfoe/Metamath-reproduce-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [feidfoe/Metamath-reproduce-7b](https://huggingface.co/feidfoe/Metamath-reproduce-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_feidfoe__Metamath-reproduce-7b\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T14:22:23.494556](https://huggingface.co/datasets/open-llm-leaderboard/details_feidfoe__Metamath-reproduce-7b/blob/main/results_2023-12-02T14-22-23.494556.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5815011372251706,\n\
\ \"acc_stderr\": 0.013588287284030881\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.5815011372251706,\n \"acc_stderr\": 0.013588287284030881\n\
\ }\n}\n```"
repo_url: https://huggingface.co/feidfoe/Metamath-reproduce-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_02T14_22_23.494556
path:
- '**/details_harness|gsm8k|5_2023-12-02T14-22-23.494556.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T14-22-23.494556.parquet'
- config_name: results
data_files:
- split: 2023_12_02T14_22_23.494556
path:
- results_2023-12-02T14-22-23.494556.parquet
- split: latest
path:
- results_2023-12-02T14-22-23.494556.parquet
---
# Dataset Card for Evaluation run of feidfoe/Metamath-reproduce-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/feidfoe/Metamath-reproduce-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [feidfoe/Metamath-reproduce-7b](https://huggingface.co/feidfoe/Metamath-reproduce-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_feidfoe__Metamath-reproduce-7b",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-02T14:22:23.494556](https://huggingface.co/datasets/open-llm-leaderboard/details_feidfoe__Metamath-reproduce-7b/blob/main/results_2023-12-02T14-22-23.494556.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5815011372251706,
"acc_stderr": 0.013588287284030881
},
"harness|gsm8k|5": {
"acc": 0.5815011372251706,
"acc_stderr": 0.013588287284030881
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
result-kand2-sdxl-wuerst-karlo/2dde8d06 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 169
num_examples: 10
download_size: 1357
dataset_size: 169
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "2dde8d06"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
teleprint-me/phi-1 | ---
title: 'Phi-1 Model Dataset'
date: '2023-07-03'
license: cc-by-nc-sa-3.0
---
## Dataset Description
- **Homepage:** [teleprint.me](https://teleprint.me)
- **Repository:** [phi-1](https://huggingface.co/datasets/teleprint-me/phi-1)
- **Paper:** [2306.11644v1](https://arxiv.org/abs/2306.11644v1)
- **Leaderboard:** [Link to the leaderboard]
- **Point of Contact:** [aberrio@teleprint.me](mailto:aberrio@teleprint.me)
### Dataset Summary
This dataset is created for training the phi-1 model, based on the paper
"Textbooks are All You Need". It contains high-quality data derived from various
textbooks, transformed and synthesized using OpenAI's GPT-3.5 and GPT-4 models.
For optimal results, it is recommended to train models with the following
parameters and sequence lengths:
- For a model with 350M parameters, use a sequence length of 2048.
- For a model with 700M parameters, use a sequence length of 4096.
- For a model with 1.3B parameters, use a sequence length of 8096.
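The pairings above can be captured as a simple lookup table. This is only a sketch; the values are reproduced exactly as stated in the list above:

```python
# Recommended training sequence length per model size,
# reproduced verbatim from the recommendations in this card.
SEQ_LEN_BY_PARAMS = {
    "350M": 2048,
    "700M": 4096,
    "1.3B": 8096,
}

assert SEQ_LEN_BY_PARAMS["350M"] == 2048
```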
Please note that the dataset is currently in its initial phase of planning and
collection. The process involves preparing the data, extracting it, formatting
it, chunking it, and preparing it for synthesis. Scripts for preparing and
processing the data for the model will be developed. Once the data is generated,
it will undergo a review and revision process to ensure its quality and
relevance.
These recommendations and notes are based on the dataset creator's initial plans
and may be subject to change as the project progresses.
**NOTE**: Due to the nature of this dataset, it cannot be released without
obtaining permissions from the respective publishers and/or authors. If you are
an author or publisher and have any concerns about this repository, please feel
free to email me.
If you are an author or publisher and would like to grant permission for the use
of your work, your support would be greatly appreciated. Please note that in
order for the dataset to be released, permissions would need to be unanimous
from all involved parties.
In the absence of such permissions, I will respect the copyrights of the
copyrighted materials and exercise my right to Fair Use with my own physical
property for personal use.
**This dataset is NOT intended for commercial purposes**. Its primary purpose is
for research in machine learning and AI software development. If a model is
created using this dataset, it will be shared under the same license.
Any proceeds derived from donations will be primarily used for the development
of the dataset and the model.
### Supported Tasks and Leaderboards
- `text-generation`: The dataset can be used to train a model for chat-like text
generation, more specifically, for generating explanations and examples in the
context of arithmetic, algebra, geometry, trigonometry, calculus, algorithms
and data structures, design patterns, and the Python programming language.
### Languages
The text in the dataset is in English.
## Dataset Structure
### Data Instances
A data instance consists of a dialogue between a user and an assistant,
discussing a topic in arithmetic, algebra, geometry, trigonometry, calculus,
algorithms and data structures, design patterns, or the Python programming
language. The dialogue is structured as a list of turns, each turn containing
the role ("user" or "assistant") and the content of the turn.
### Data Fields
- `role`: a string indicating the role of the speaker in the dialogue ("system",
"user", "assistant", "function").
- `content`: a string containing the content of the speaker's turn in the
dialogue.
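As a sketch of the schema above, a single dialogue instance might look like the following (the content is invented for illustration and is not drawn from the dataset):

```python
# Hypothetical dialogue instance following the role/content schema.
dialogue = [
    {"role": "system", "content": "You are a helpful math tutor."},
    {"role": "user", "content": "What is the derivative of x**2?"},
    {"role": "assistant", "content": "The derivative of x**2 is 2*x."},
]

# Each turn carries exactly the two fields documented above.
for turn in dialogue:
    assert set(turn) == {"role", "content"}
```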
### Data Splits
The dataset is split into a training set, a validation set, and a test set. The
exact sizes and proportions of these splits will depend on the final size of the
dataset.
## Dataset Creation
### Curation Rationale
The dataset is being created to train a model capable of generating explanations
and examples in the context of various mathematical and computer science topics.
The goal is to create an AI assistant that can provide clear, accurate, and
pedagogically sound responses to user queries on these topics.
### Source Data
#### Initial Data Collection and Normalization
The data is collected from a variety of textbooks covering arithmetic, algebra,
geometry, trigonometry, calculus, algorithms and data structures, design
patterns, and the Python programming language. The textbooks used include:
- Barron's Arithmetic The Easy Way Fourth Edition
- Blitzer Introductory Algebra for College Students Fifth Edition
- McDougal Littell Geometry
- Blitzer Intermediate Algebra for College Students 5th Edition
- Trigonometry Sixth Edition
- Pearson College Algebra Fourth Edition
- Hughes-Hallet Applied Calculus 5th Edition
- CLRS Introduction to Algorithms Third Edition
In addition to the textbooks, the dataset also includes material from the
following online resources:
- [C reference](https://en.cppreference.com/w/c)
- [Cpp reference](https://en.cppreference.com/w/cpp)
- [Python Standard Library](https://docs.python.org/3/)
These resources provide up-to-date information and examples for the C, C++, and
Python programming languages. The creators of the Cppreference site also provide
[archives](https://en.cppreference.com/w/Cppreference:Archives) of their site
for offline use. Code samples synthesized by OpenAI's GPT models, curated by the
dataset creator, are also included in the dataset.
**Note:** The creator of this dataset owns physical copies of all the textbooks
listed above. The data from these sources are transformed into a dialogue format
using OpenAI's GPT-3.5 and GPT-4 models. The resulting dialogues are then used
as the training data for the phi-1 model. This dataset does not include the full
content of the source textbooks. Instead, it consists of transformations and
syntheses of the original content. Anyone who wants access to the full original
content should purchase or otherwise legally access the textbooks themselves.
#### Who are the source language producers?
The original language data was created by a variety of authors and educators,
who wrote the textbooks and other materials used as sources for this dataset.
These include:
- Barron's Arithmetic The Easy Way Fourth Edition - Edward Williams, Katie
Prindle
- Blitzer Introductory Algebra for College Students Fifth Edition - Robert
Blitzer
- McDougal Littell Geometry - Ron Larson, Laurie Boswell, Timothy D. Kanold, Lee
Stiff
- Blitzer Intermediate Algebra for College Students 5th Edition - Robert Blitzer
- Trigonometry Sixth Edition - Charles P. McKeague, Mark D. Turner
- Pearson College Algebra Fourth Edition - Robert F. Blitzer
- Hughes-Hallet Applied Calculus 5th Edition - Deborah Hughes-Hallett, Andrew M.
Gleason, Patti Frazer Lock, Daniel E. Flath, Sheldon P. Gordon, David O.
Lomen, David Lovelock, William G. McCallum, Brad G. Osgood, Andrew Pasquale,
Jeff Tecosky-Feldman, Joseph Thrash, Karen R. Rhea, Thomas W. Tucker
- CLRS Introduction to Algorithms Third Edition - Thomas H. Cormen, Charles E.
Leiserson, Ronald L. Rivest, Clifford Stein
In addition to these authors, the developers of OpenAI's GPT-3.5 and GPT-4
models also contributed to the creation of the language data, as these models
were used to transform the source material into a dialogue format.
### Annotations
#### Annotation process
The dataset does not contain any explicit annotations. However, the data is
curated and synthesized using OpenAI's GPT-3.5 and GPT-4 models. The process
involves transforming the source material into a dialogue format suitable for
training the phi-1 model. The dataset creator, an independent learner with a
strong interest in computer science, reviewed and curated the synthesized
dialogues to ensure their quality and relevance.
#### Who are the annotators?
The dataset creator, an independent learner who has studied computer science
extensively in a self-directed manner, performed the curation and review of the
synthesized dialogues.
### Personal and Sensitive Information
The dataset does not contain any personal or sensitive information. All the data
is derived from publicly available textbooks and online resources. Any names or
other potential identifiers in the source material have been removed or
anonymized.
### Social Impact of Dataset
The dataset is intended to support the development of AI models capable of
providing detailed explanations and examples in the context of arithmetic,
algebra, geometry, trigonometry, calculus, algorithms and data structures,
design patterns, and the Python programming language. The potential social
impact is significant, as such models could greatly enhance self-directed
learning and provide valuable educational support to students worldwide.
However, it's important to note that the quality and usefulness of the AI models
trained on this dataset will depend on the quality of the data itself. If the
data is inaccurate or biased, the models could propagate these inaccuracies and
biases, potentially leading to misinformation or unfair outcomes.
### Discussion of Biases
The dataset is based on a variety of textbooks and online resources, which may
contain their own inherent biases. For example, textbooks often reflect the
perspectives and biases of their authors, which can influence the way
information is presented. These biases could potentially be reflected in the
dataset and in any models trained on it.
### Other Known Limitations
At this stage of the dataset creation process, it's difficult to identify all
potential limitations. However, one potential limitation is that the dataset may
not cover all possible topics or perspectives within the fields it addresses.
The dataset creator will continue to monitor and assess the dataset for
limitations as the work progresses.
## Additional Information
### Dataset Curators
The dataset was curated by an independent learner with a strong interest in
computer science. The curator has studied the subject matter in a self-directed
manner, using a variety of resources including textbooks and online materials.
The curation process also involved the use of OpenAI's GPT-3.5 and GPT-4 models
to synthesize dialogues based on the source material.
### Licensing Information
This dataset is released under the Creative Commons
Attribution-NonCommercial-ShareAlike 3.0 International (CC BY-NC-SA 3.0)
license.
### Citation Information
As this dataset is a compilation of various sources synthesized and curated for
the purpose of training the phi-1 model, please ensure to cite the original
sources when using this dataset. If referencing the dataset directly, please
refer to this repository.
|
huggingartists/lazy-jay | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/lazy-jay"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.039845 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/c3045337575e2ce646bbc54369de4143.450x427x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/lazy-jay">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lazy Jay</div>
<a href="https://genius.com/artists/lazy-jay">
<div style="text-align: center; font-size: 14px;">@lazy-jay</div>
</a>
</div>
### Dataset Summary
A lyrics dataset parsed from Genius, designed for generating lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/lazy-jay).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lazy-jay")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|6| -| -|
The 'train' split can easily be divided into 'train', 'validation', and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/lazy-jay")

train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03  # implicit remainder after the first two cuts

# Split the single 'train' split at the 90% and 97% marks,
# yielding 90% / 7% / 3% train / validation / test portions.
train, validation, test = np.split(
    datasets['train']['text'],
    [int(len(datasets['train']['text']) * train_percentage),
     int(len(datasets['train']['text']) * (train_percentage + validation_percentage))],
)

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)})
    }
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
indolem/IndoCulture | ---
license: cc-by-nc-sa-4.0
---
|
kosta-naumenko/medflex | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 2574069
num_examples: 1934
download_size: 314783
dataset_size: 2574069
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "medflex"
```python
from datasets import load_dataset

dataset = load_dataset("kosta-naumenko/medflex", split='train', download_mode='force_redownload', verification_mode='no_checks')
```
- `tokens` - a list of lists of sentence words (pass `is_split_into_words=True` when tokenizing)
- `ner_tags` - a list of lists of per-word classes:
  - 0 - not a symptom
  - 1 - beginning of a symptom
  - 2 - continuation of a symptom

An example of further processing: https://huggingface.co/learn/nlp-course/chapter7/2
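As an illustrative sketch (not an official utility of this dataset), the 0/1/2 tagging scheme can be decoded into symptom spans as follows:

```python
def extract_symptoms(tokens, ner_tags):
    """Group tokens into symptom spans: tag 1 starts a span, tag 2 continues it, tag 0 ends it."""
    spans, current = [], []
    for token, tag in zip(tokens, ner_tags):
        if tag == 1:  # beginning of a symptom
            if current:
                spans.append(" ".join(current))
            current = [token]
        elif tag == 2 and current:  # continuation of a symptom
            current.append(token)
        else:  # tag 0: not a symptom
            if current:
                spans.append(" ".join(current))
                current = []
    if current:
        spans.append(" ".join(current))
    return spans

# Hypothetical example sentence (tokens and tags are invented):
extract_symptoms(["patient", "reports", "severe", "headache"], [0, 0, 1, 2])
# → ["severe headache"]
```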
|
open-llm-leaderboard/details_amu__dpo-phi2 | ---
pretty_name: Evaluation run of amu/dpo-phi2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [amu/dpo-phi2](https://huggingface.co/amu/dpo-phi2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_amu__dpo-phi2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T22:52:41.834873](https://huggingface.co/datasets/open-llm-leaderboard/details_amu__dpo-phi2/blob/main/results_2024-02-09T22-52-41.834873.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5828070162053215,\n\
\ \"acc_stderr\": 0.03369036649487999,\n \"acc_norm\": 0.5845127625459068,\n\
\ \"acc_norm_stderr\": 0.03437729917800213,\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.4398875544767273,\n\
\ \"mc2_stderr\": 0.015069641700788115\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5836177474402731,\n \"acc_stderr\": 0.01440561827943618,\n\
\ \"acc_norm\": 0.6168941979522184,\n \"acc_norm_stderr\": 0.014206472661672874\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5633339972117108,\n\
\ \"acc_stderr\": 0.004949589567678895,\n \"acc_norm\": 0.7513443537143996,\n\
\ \"acc_norm_stderr\": 0.004313503876346087\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.042849586397533994,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.042849586397533994\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.040089737857792046,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.040089737857792046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456344,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456344\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4417989417989418,\n \"acc_stderr\": 0.025576257061253837,\n \"\
acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.025576257061253837\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n\
\ \"acc_stderr\": 0.026148685930671742,\n \"acc_norm\": 0.6967741935483871,\n\
\ \"acc_norm_stderr\": 0.026148685930671742\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806586,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806586\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646836,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646836\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245282,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.028408953626245282\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.025069094387296532,\n\
\ \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.025069094387296532\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616265,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616265\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552378,\n \
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552378\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659807,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659807\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630797,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630797\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.034063153607115065,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.034063153607115065\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6617647058823529,\n \"acc_stderr\": 0.03320574612945431,\n \"\
acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.03320574612945431\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652265,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652265\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6883780332056194,\n\
\ \"acc_stderr\": 0.016562433867284176,\n \"acc_norm\": 0.6883780332056194,\n\
\ \"acc_norm_stderr\": 0.016562433867284176\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.02519018132760842,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.02519018132760842\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.311731843575419,\n\
\ \"acc_stderr\": 0.015491756531894638,\n \"acc_norm\": 0.311731843575419,\n\
\ \"acc_norm_stderr\": 0.015491756531894638\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.027826109307283693,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.027826109307283693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\
\ \"acc_stderr\": 0.027559949802347817,\n \"acc_norm\": 0.6205787781350482,\n\
\ \"acc_norm_stderr\": 0.027559949802347817\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6234567901234568,\n \"acc_stderr\": 0.02695934451874778,\n\
\ \"acc_norm\": 0.6234567901234568,\n \"acc_norm_stderr\": 0.02695934451874778\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.423728813559322,\n\
\ \"acc_stderr\": 0.012620785155885998,\n \"acc_norm\": 0.423728813559322,\n\
\ \"acc_norm_stderr\": 0.012620785155885998\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.030343264224213528,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.030343264224213528\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5604575163398693,\n \"acc_stderr\": 0.020079420408087918,\n \
\ \"acc_norm\": 0.5604575163398693,\n \"acc_norm_stderr\": 0.020079420408087918\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.02768691358801301,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.02768691358801301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.035469769593931624,\n\
\ \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.035469769593931624\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.4398875544767273,\n\
\ \"mc2_stderr\": 0.015069641700788115\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972392\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5443517816527672,\n \
\ \"acc_stderr\": 0.013718194542485601\n }\n}\n```"
repo_url: https://huggingface.co/amu/dpo-phi2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|arc:challenge|25_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|gsm8k|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hellaswag|10_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T22-52-41.834873.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T22-52-41.834873.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- '**/details_harness|winogrande|5_2024-02-09T22-52-41.834873.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T22-52-41.834873.parquet'
- config_name: results
data_files:
- split: 2024_02_09T22_52_41.834873
path:
- results_2024-02-09T22-52-41.834873.parquet
- split: latest
path:
- results_2024-02-09T22-52-41.834873.parquet
---
# Dataset Card for Evaluation run of amu/dpo-phi2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [amu/dpo-phi2](https://huggingface.co/amu/dpo-phi2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_amu__dpo-phi2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-09T22:52:41.834873](https://huggingface.co/datasets/open-llm-leaderboard/details_amu__dpo-phi2/blob/main/results_2024-02-09T22-52-41.834873.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.5828070162053215,
"acc_stderr": 0.03369036649487999,
"acc_norm": 0.5845127625459068,
"acc_norm_stderr": 0.03437729917800213,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144912,
"mc2": 0.4398875544767273,
"mc2_stderr": 0.015069641700788115
},
"harness|arc:challenge|25": {
"acc": 0.5836177474402731,
"acc_stderr": 0.01440561827943618,
"acc_norm": 0.6168941979522184,
"acc_norm_stderr": 0.014206472661672874
},
"harness|hellaswag|10": {
"acc": 0.5633339972117108,
"acc_stderr": 0.004949589567678895,
"acc_norm": 0.7513443537143996,
"acc_norm_stderr": 0.004313503876346087
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.042849586397533994,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.042849586397533994
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.040089737857792046,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.040089737857792046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456344,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4417989417989418,
"acc_stderr": 0.025576257061253837,
"acc_norm": 0.4417989417989418,
"acc_norm_stderr": 0.025576257061253837
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.026148685930671742,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.026148685930671742
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806586,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806586
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.031156269519646836,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.031156269519646836
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.028408953626245282,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.028408953626245282
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.025069094387296532,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.025069094387296532
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616265,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616265
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.03163145807552378,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.03163145807552378
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659807,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659807
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.017266742087630797,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.017266742087630797
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.034063153607115065,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.034063153607115065
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.03320574612945431,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.03320574612945431
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652265,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652265
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6883780332056194,
"acc_stderr": 0.016562433867284176,
"acc_norm": 0.6883780332056194,
"acc_norm_stderr": 0.016562433867284176
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.02519018132760842,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.02519018132760842
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.311731843575419,
"acc_stderr": 0.015491756531894638,
"acc_norm": 0.311731843575419,
"acc_norm_stderr": 0.015491756531894638
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.027826109307283693,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.027826109307283693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6205787781350482,
"acc_stderr": 0.027559949802347817,
"acc_norm": 0.6205787781350482,
"acc_norm_stderr": 0.027559949802347817
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6234567901234568,
"acc_stderr": 0.02695934451874778,
"acc_norm": 0.6234567901234568,
"acc_norm_stderr": 0.02695934451874778
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.423728813559322,
"acc_stderr": 0.012620785155885998,
"acc_norm": 0.423728813559322,
"acc_norm_stderr": 0.012620785155885998
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.030343264224213528,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.030343264224213528
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5604575163398693,
"acc_stderr": 0.020079420408087918,
"acc_norm": 0.5604575163398693,
"acc_norm_stderr": 0.020079420408087918
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801301,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144912,
"mc2": 0.4398875544767273,
"mc2_stderr": 0.015069641700788115
},
"harness|winogrande|5": {
"acc": 0.7419100236779794,
"acc_stderr": 0.012298278833972392
},
"harness|gsm8k|5": {
"acc": 0.5443517816527672,
"acc_stderr": 0.013718194542485601
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pkr7098/fNIRS | ---
license: cc-by-4.0
---
# Dataset Card for "fNIRS"
* fNIRS_data.pkl contains only the data (numpy.array), without labels
* fNIRS_label.pkl contains only the labels (numpy.array)
* fNIRS_dataset.pkl contains both data (numpy.array) and labels (numpy.array), structured like:
```python
{
    'train': {
        'sub_01': {
            'data': ...,   # numpy.array
            'label': ...,  # numpy.array
        },
        ...
    },
    'val': {...},
    'test': {...},
}
```
# Information
* The dataset has 82,620 samples, each with 8 channels and a time-series length of 200.
* The number of classes is 4 (0, 1, 2, 3).
This dataset is from https://tufts-hci-lab.github.io/code_and_datasets/fNIRS2MW.html |
Adongua/autotrain-data-test-sa-gam | ---
language:
- en
task_categories:
- summarization
---
# AutoTrain Dataset for project: test-sa-gam
## Dataset Description
This dataset has been automatically processed by AutoTrain for project test-sa-gam.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "It is easy to navigate and update programs",
"target": "[([6, 7], [2]), ([4], [2])]"
},
{
"text": "The big screen allows you to enjoy watching movies , pictures and etc",
"target": "[([2], [1])]"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
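Note that `target` is stored as a string; a minimal sketch of parsing it back into Python objects, assuming the stringified-list format shown in the sample above (the semantics of the nested index lists are project-specific):

```python
import ast

sample = {
    "text": "It is easy to navigate and update programs",
    "target": "[([6, 7], [2]), ([4], [2])]",
}

# ast.literal_eval safely evaluates the string as a Python literal,
# recovering the list of tuples of index lists without using eval().
pairs = ast.literal_eval(sample["target"])
print(pairs)  # [([6, 7], [2]), ([4], [2])]
```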
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 1016 |
| valid | 112 |
|
xyaoaf/ESPM288 | ---
license: pddl
---
|
tyzhu/squad_wrong_id_train_10_eval_10 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 237881
num_examples: 150
- name: validation
num_bytes: 59884
num_examples: 48
download_size: 28458
dataset_size: 297765
---
# Dataset Card for "squad_wrong_id_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fathyshalab/clinic-meta | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
- name: label_text
dtype: string
splits:
- name: train
num_bytes: 65857.4
num_examples: 1050
- name: test
num_bytes: 28224.6
num_examples: 450
download_size: 0
dataset_size: 94082.0
---
```
@inproceedings{larson-etal-2019-evaluation,
title = "An Evaluation Dataset for Intent Classification and Out-of-Scope Prediction",
author = "Larson, Stefan and
Mahendran, Anish and
Peper, Joseph J. and
Clarke, Christopher and
Lee, Andrew and
Hill, Parker and
Kummerfeld, Jonathan K. and
Leach, Kevin and
Laurenzano, Michael A. and
Tang, Lingjia and
Mars, Jason",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)",
year = "2019",
url = "https://www.aclweb.org/anthology/D19-1131"
}
``` |
nanyy1025/covid_fake_news | ---
task_categories:
- text-classification
- zero-shot-classification
language:
- en
---
Constraint@AAAI2021 - COVID19 Fake News Detection in English
```
@misc{patwa2020fighting,
title={Fighting an Infodemic: COVID-19 Fake News Dataset},
author={Parth Patwa and Shivam Sharma and Srinivas PYKL and Vineeth Guptha and Gitanjali Kumari and Md Shad Akhtar and Asif Ekbal and Amitava Das and Tanmoy Chakraborty},
year={2020},
eprint={2011.03327},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
open-llm-leaderboard/details_MaziyarPanahi__Bioxtral-4x7B-v0.1 | ---
pretty_name: Evaluation run of MaziyarPanahi/Bioxtral-4x7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/Bioxtral-4x7B-v0.1](https://huggingface.co/MaziyarPanahi/Bioxtral-4x7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__Bioxtral-4x7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T03:03:06.477232](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Bioxtral-4x7B-v0.1/blob/main/results_2024-03-01T03-03-06.477232.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6390815384774987,\n\
\ \"acc_stderr\": 0.03233527173865626,\n \"acc_norm\": 0.6405373328568302,\n\
\ \"acc_norm_stderr\": 0.032994557880045274,\n \"mc1\": 0.5152998776009792,\n\
\ \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.6845419346695587,\n\
\ \"mc2_stderr\": 0.014829461272743373\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.658703071672355,\n \"acc_stderr\": 0.01385583128749772,\n\
\ \"acc_norm\": 0.6834470989761092,\n \"acc_norm_stderr\": 0.013592431519068079\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6946823341963753,\n\
\ \"acc_stderr\": 0.004596006250433548,\n \"acc_norm\": 0.8727345150368453,\n\
\ \"acc_norm_stderr\": 0.003325890225529856\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742397,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742397\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.041633319989322605,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.041633319989322605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406786,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406786\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812142,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812142\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02390115797940254,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02390115797940254\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977927,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977927\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.01659525971039931,\n \"\
acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.01659525971039931\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n\
\ \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.02636165166838909,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.02636165166838909\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464074,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464074\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47374301675977654,\n\
\ \"acc_stderr\": 0.016699427672784768,\n \"acc_norm\": 0.47374301675977654,\n\
\ \"acc_norm_stderr\": 0.016699427672784768\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495033,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495033\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42894393741851367,\n\
\ \"acc_stderr\": 0.012640625443067358,\n \"acc_norm\": 0.42894393741851367,\n\
\ \"acc_norm_stderr\": 0.012640625443067358\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6437908496732027,\n \"acc_stderr\": 0.019373332420724507,\n \
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.019373332420724507\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5152998776009792,\n\
\ \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.6845419346695587,\n\
\ \"mc2_stderr\": 0.014829461272743373\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8287292817679558,\n \"acc_stderr\": 0.010588417294962524\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5663381349507203,\n \
\ \"acc_stderr\": 0.013650728047064688\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/Bioxtral-4x7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|arc:challenge|25_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|gsm8k|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hellaswag|10_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T03-03-06.477232.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T03-03-06.477232.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- '**/details_harness|winogrande|5_2024-03-01T03-03-06.477232.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T03-03-06.477232.parquet'
- config_name: results
data_files:
- split: 2024_03_01T03_03_06.477232
path:
- results_2024-03-01T03-03-06.477232.parquet
- split: latest
path:
- results_2024-03-01T03-03-06.477232.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/Bioxtral-4x7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/Bioxtral-4x7B-v0.1](https://huggingface.co/MaziyarPanahi/Bioxtral-4x7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Bioxtral-4x7B-v0.1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-01T03:03:06.477232](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Bioxtral-4x7B-v0.1/blob/main/results_2024-03-01T03-03-06.477232.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the `results` config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6390815384774987,
"acc_stderr": 0.03233527173865626,
"acc_norm": 0.6405373328568302,
"acc_norm_stderr": 0.032994557880045274,
"mc1": 0.5152998776009792,
"mc1_stderr": 0.017495304473187902,
"mc2": 0.6845419346695587,
"mc2_stderr": 0.014829461272743373
},
"harness|arc:challenge|25": {
"acc": 0.658703071672355,
"acc_stderr": 0.01385583128749772,
"acc_norm": 0.6834470989761092,
"acc_norm_stderr": 0.013592431519068079
},
"harness|hellaswag|10": {
"acc": 0.6946823341963753,
"acc_stderr": 0.004596006250433548,
"acc_norm": 0.8727345150368453,
"acc_norm_stderr": 0.003325890225529856
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742397,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742397
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322605,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406786,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406786
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812142,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812142
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02390115797940254,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02390115797940254
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977927,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977927
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.01659525971039931,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.01659525971039931
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.02636165166838909,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.02636165166838909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464074,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464074
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47374301675977654,
"acc_stderr": 0.016699427672784768,
"acc_norm": 0.47374301675977654,
"acc_norm_stderr": 0.016699427672784768
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495033,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495033
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42894393741851367,
"acc_stderr": 0.012640625443067358,
"acc_norm": 0.42894393741851367,
"acc_norm_stderr": 0.012640625443067358
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.019373332420724507,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.019373332420724507
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5152998776009792,
"mc1_stderr": 0.017495304473187902,
"mc2": 0.6845419346695587,
"mc2_stderr": 0.014829461272743373
},
"harness|winogrande|5": {
"acc": 0.8287292817679558,
"acc_stderr": 0.010588417294962524
},
"harness|gsm8k|5": {
"acc": 0.5663381349507203,
"acc_stderr": 0.013650728047064688
}
}
```
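For orientation, the `"all"` block above is (roughly) an average of the per-task scores. The sketch below shows that idea on a toy dict; it is an illustration of the relationship, not the leaderboard's actual aggregation code, and the helper name `average_metric` is made up here.

```python
# Sketch: average a metric over per-task entries of a results dict shaped like
# the JSON above. Illustrative only -- not the leaderboard's aggregation code.

def average_metric(results: dict, metric: str = "acc") -> float:
    """Mean of `metric` over all task entries that report it, skipping 'all'."""
    values = [
        scores[metric]
        for task, scores in results.items()
        if task != "all" and metric in scores
    ]
    return sum(values) / len(values)

example = {
    "all": {"acc": 0.65},
    "harness|arc:challenge|25": {"acc": 0.6},
    "harness|hellaswag|10": {"acc": 0.7},
}
print(average_metric(example))  # approximately 0.65
```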
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zeyneppktemm/deneme | ---
dataset_info:
features:
- name: tag
dtype: string
- name: patterns
dtype: string
splits:
- name: train
num_bytes: 62948.13373860182
num_examples: 888
- name: test
num_bytes: 7017.866261398176
num_examples: 99
download_size: 26192
dataset_size: 69966.0
---
# Dataset Card for "deneme"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Parikshith/grow-1-monolingual-ha-en-comet-wmt21 | ---
dataset_info:
features:
- name: src
dtype: string
- name: mt
dtype: string
- name: score
dtype: float64
splits:
- name: small
num_bytes: 24923873
num_examples: 100000
download_size: 16395230
dataset_size: 24923873
configs:
- config_name: default
data_files:
- split: small
path: data/small-*
---
|
yelp_review_full | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- sentiment-classification
pretty_name: YelpReviewFull
license_details: yelp-licence
dataset_info:
config_name: yelp_review_full
features:
- name: label
dtype:
class_label:
names:
'0': 1 star
'1': 2 star
'2': 3 stars
'3': 4 stars
'4': 5 stars
- name: text
dtype: string
splits:
- name: train
num_bytes: 483811554
num_examples: 650000
- name: test
num_bytes: 37271188
num_examples: 50000
download_size: 322952369
dataset_size: 521082742
configs:
- config_name: yelp_review_full
data_files:
- split: train
path: yelp_review_full/train-*
- split: test
path: yelp_review_full/test-*
default: true
train-eval-index:
- config: yelp_review_full
task: text-classification
task_id: multi_class_classification
splits:
train_split: train
eval_split: test
col_mapping:
text: text
label: target
metrics:
- type: accuracy
name: Accuracy
- type: f1
name: F1 macro
args:
average: macro
- type: f1
name: F1 micro
args:
average: micro
- type: f1
name: F1 weighted
args:
average: weighted
- type: precision
name: Precision macro
args:
average: macro
- type: precision
name: Precision micro
args:
average: micro
- type: precision
name: Precision weighted
args:
average: weighted
- type: recall
name: Recall macro
args:
average: macro
- type: recall
name: Recall micro
args:
average: micro
- type: recall
name: Recall weighted
args:
average: weighted
---
# Dataset Card for YelpReviewFull
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Yelp](https://www.yelp.com/dataset)
- **Repository:** [Crepe](https://github.com/zhangxiangxiao/Crepe)
- **Paper:** [Character-level Convolutional Networks for Text Classification](https://arxiv.org/abs/1509.01626)
- **Point of Contact:** [Xiang Zhang](mailto:xiang.zhang@nyu.edu)
### Dataset Summary
The Yelp reviews dataset consists of reviews from Yelp.
It is extracted from the Yelp Dataset Challenge 2015 data.
### Supported Tasks and Leaderboards
- `text-classification`, `sentiment-classification`: The dataset is mainly used for text classification: given the text, predict the sentiment.
### Languages
The reviews were mainly written in English.
## Dataset Structure
### Data Instances
A typical data point comprises a text and the corresponding label.
An example from the YelpReviewFull test set looks as follows:
```
{
'label': 0,
'text': 'I got \'new\' tires from them and within two weeks got a flat. I took my car to a local mechanic to see if i could get the hole patched, but they said the reason I had a flat was because the previous patch had blown - WAIT, WHAT? I just got the tire and never needed to have it patched? This was supposed to be a new tire. \\nI took the tire over to Flynn\'s and they told me that someone punctured my tire, then tried to patch it. So there are resentful tire slashers? I find that very unlikely. After arguing with the guy and telling him that his logic was far fetched he said he\'d give me a new tire \\"this time\\". \\nI will never go back to Flynn\'s b/c of the way this guy treated me and the simple fact that they gave me a used tire!'
}
```
### Data Fields
- 'text': The review texts are escaped using double quotes ("), and any internal double quote is escaped by 2 double quotes (""). New lines are escaped by a backslash followed by an "n" character, that is "\n".
- 'label': Corresponds to the score associated with the review (between 1 and 5).
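The escaping convention described above comes from the original CSV distribution (the hosted dataset may already present clean text). A minimal sketch of undoing it, with a made-up helper name:

```python
# Illustration of the CSV-style escaping described above: newlines appear as a
# literal backslash-n sequence, and internal double quotes are doubled ("").

def unescape_review(text: str) -> str:
    """Undo the escaping: '\\n' -> real newline, '""' -> '"'."""
    return text.replace('\\n', '\n').replace('""', '"')

raw = 'Great tacos!\\nWould say ""amazing"" again.'
print(unescape_review(raw))
```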
### Data Splits
The Yelp reviews full star dataset is constructed by randomly taking 130,000 training samples and 10,000 testing samples for each review star from 1 to 5.
In total there are 650,000 training samples and 50,000 testing samples.
## Dataset Creation
### Curation Rationale
The Yelp reviews full star dataset is constructed by Xiang Zhang (xiang.zhang@nyu.edu) from the Yelp Dataset Challenge 2015. It is first used as a text classification benchmark in the following paper: Xiang Zhang, Junbo Zhao, Yann LeCun. Character-level Convolutional Networks for Text Classification. Advances in Neural Information Processing Systems 28 (NIPS 2015).
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
You can check the official [yelp-dataset-agreement](https://s3-media3.fl.yelpcdn.com/assets/srv0/engineering_pages/bea5c1e92bf3/assets/vendor/yelp-dataset-agreement.pdf).
### Citation Information
Xiang Zhang, Junbo Zhao, Yann LeCun. Character-level Convolutional Networks for Text Classification. Advances in Neural Information Processing Systems 28 (NIPS 2015).
### Contributions
Thanks to [@hfawaz](https://github.com/hfawaz) for adding this dataset. |
Francesco/marbles | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': marbles
'1': red
'2': white
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: marbles
tags:
- rf100
---
# Dataset Card for marbles
**The original COCO dataset is stored at `dataset.tar.gz`.**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/marbles
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
marbles
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 964043,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column (`dataset[0]["image"]`), the image file is automatically decoded. Decoding a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`.
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
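The `bbox` field uses the COCO convention `[x_min, y_min, width, height]`. Many tools (e.g. Pascal VOC style pipelines) expect corner coordinates instead; a tiny sketch of the conversion, using a made-up helper name:

```python
# Convert a COCO-format bounding box [x_min, y_min, width, height], as in the
# `bbox` field above, to corner format [x_min, y_min, x_max, y_max].

def coco_to_corners(bbox):
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# First box from the data instance shown above:
print(coco_to_corners([302.0, 109.0, 73.0, 52.0]))  # [302.0, 109.0, 375.0, 161.0]
```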
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/marbles
### Citation Information
```
@misc{ marbles,
title = { marbles Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/marbles } },
url = { https://universe.roboflow.com/object-detection/marbles },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
fenffef/cmnli | ---
license: mit
---
|
RUCAIBox/Task-Dialogue | ---
language:
- en
multilinguality:
- monolingual
task_categories:
- conversational
task_ids:
- dialogue-generation
tags:
- dialogue-response-generation
- task-dialogue
- dialog-response-generation
---
These are the task-oriented dialogue datasets collected by TextBox, including:
- MultiWOZ 2.0 (multiwoz)
- MetaLWOZ (metalwoz)
- KVRET (kvret)
- WOZ (woz)
- CamRest676 (camres676)
- Frames (frames)
- TaskMaster (taskmaster)
- Schema-Guided (schema)
- MSR-E2E (e2e_msr).
The details and leaderboard of each dataset can be found on the [TextBox page](https://github.com/RUCAIBox/TextBox#dataset). |
JovialValley/phoneme_totalMapped3 | ---
dataset_info:
features:
- name: input_values
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 109775968
num_examples: 390
- name: test
num_bytes: 27190896
num_examples: 97
download_size: 137863961
dataset_size: 136966864
---
# Dataset Card for "phoneme_totalMapped3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TmB89/us_dataset | ---
license: mit
---
|
joefox/LibriSpeech_test_noise | ---
license: apache-2.0
---
### Dataset Summary
Augmented part of the test data of the LibriSpeech dataset.
The original test split was taken as a basis and augmented by adding extraneous noise.
|
senhorsapo/kratos | ---
license: openrail
---
|
open-llm-leaderboard/details_Walmart-the-bag__Influxient-4x13B | ---
pretty_name: Evaluation run of Walmart-the-bag/Influxient-4x13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Walmart-the-bag/Influxient-4x13B](https://huggingface.co/Walmart-the-bag/Influxient-4x13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Walmart-the-bag__Influxient-4x13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T01:10:07.093239](https://huggingface.co/datasets/open-llm-leaderboard/details_Walmart-the-bag__Influxient-4x13B/blob/main/results_2023-12-30T01-10-07.093239.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5727072313721517,\n\
\ \"acc_stderr\": 0.033466156465793005,\n \"acc_norm\": 0.5776499509226207,\n\
\ \"acc_norm_stderr\": 0.03415178949023358,\n \"mc1\": 0.37454100367197063,\n\
\ \"mc1_stderr\": 0.016943535128405334,\n \"mc2\": 0.5410446803363212,\n\
\ \"mc2_stderr\": 0.0155300726933085\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5784982935153583,\n \"acc_stderr\": 0.014430197069326023,\n\
\ \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.01423587248790987\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6480780720971918,\n\
\ \"acc_stderr\": 0.004765937515197188,\n \"acc_norm\": 0.834196375224059,\n\
\ \"acc_norm_stderr\": 0.0037114419828661784\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.03784271932887467,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.03784271932887467\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.024373197867983067,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.024373197867983067\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n\
\ \"acc_stderr\": 0.026985289576552746,\n \"acc_norm\": 0.6580645161290323,\n\
\ \"acc_norm_stderr\": 0.026985289576552746\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5487179487179488,\n \"acc_stderr\": 0.025230381238934833,\n\
\ \"acc_norm\": 0.5487179487179488,\n \"acc_norm_stderr\": 0.025230381238934833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"\
acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069422,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069422\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794089,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794089\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489284,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489284\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.014987270640946005,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.014987270640946005\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6213872832369942,\n \"acc_stderr\": 0.026113749361310345,\n\
\ \"acc_norm\": 0.6213872832369942,\n \"acc_norm_stderr\": 0.026113749361310345\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n\
\ \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n\
\ \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.027368078243971642,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.027368078243971642\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.026406145973625676,\n\
\ \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.026406145973625676\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4256844850065189,\n\
\ \"acc_stderr\": 0.012628393551811945,\n \"acc_norm\": 0.4256844850065189,\n\
\ \"acc_norm_stderr\": 0.012628393551811945\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5898692810457516,\n \"acc_stderr\": 0.019898412717635906,\n \
\ \"acc_norm\": 0.5898692810457516,\n \"acc_norm_stderr\": 0.019898412717635906\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505418,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505418\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37454100367197063,\n\
\ \"mc1_stderr\": 0.016943535128405334,\n \"mc2\": 0.5410446803363212,\n\
\ \"mc2_stderr\": 0.0155300726933085\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759987\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3305534495830174,\n \
\ \"acc_stderr\": 0.012957496367085026\n }\n}\n```"
repo_url: https://huggingface.co/Walmart-the-bag/Influxient-4x13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|arc:challenge|25_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|gsm8k|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hellaswag|10_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T01-10-07.093239.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T01-10-07.093239.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- '**/details_harness|winogrande|5_2023-12-30T01-10-07.093239.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T01-10-07.093239.parquet'
- config_name: results
data_files:
- split: 2023_12_30T01_10_07.093239
path:
- results_2023-12-30T01-10-07.093239.parquet
- split: latest
path:
- results_2023-12-30T01-10-07.093239.parquet
---
# Dataset Card for Evaluation run of Walmart-the-bag/Influxient-4x13B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Walmart-the-bag/Influxient-4x13B](https://huggingface.co/Walmart-the-bag/Influxient-4x13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Walmart-the-bag__Influxient-4x13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-30T01:10:07.093239](https://huggingface.co/datasets/open-llm-leaderboard/details_Walmart-the-bag__Influxient-4x13B/blob/main/results_2023-12-30T01-10-07.093239.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5727072313721517,
"acc_stderr": 0.033466156465793005,
"acc_norm": 0.5776499509226207,
"acc_norm_stderr": 0.03415178949023358,
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405334,
"mc2": 0.5410446803363212,
"mc2_stderr": 0.0155300726933085
},
"harness|arc:challenge|25": {
"acc": 0.5784982935153583,
"acc_stderr": 0.014430197069326023,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.01423587248790987
},
"harness|hellaswag|10": {
"acc": 0.6480780720971918,
"acc_stderr": 0.004765937515197188,
"acc_norm": 0.834196375224059,
"acc_norm_stderr": 0.0037114419828661784
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.03784271932887467,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.03784271932887467
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.024373197867983067,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.024373197867983067
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.026985289576552746,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.026985289576552746
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5487179487179488,
"acc_stderr": 0.025230381238934833,
"acc_norm": 0.5487179487179488,
"acc_norm_stderr": 0.025230381238934833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7724770642201835,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.7724770642201835,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069422,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069422
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794089,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794089
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489284,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489284
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946005,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946005
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6213872832369942,
"acc_stderr": 0.026113749361310345,
"acc_norm": 0.6213872832369942,
"acc_norm_stderr": 0.026113749361310345
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971642,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971642
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.026406145973625676,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.026406145973625676
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4256844850065189,
"acc_stderr": 0.012628393551811945,
"acc_norm": 0.4256844850065189,
"acc_norm_stderr": 0.012628393551811945
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5898692810457516,
"acc_stderr": 0.019898412717635906,
"acc_norm": 0.5898692810457516,
"acc_norm_stderr": 0.019898412717635906
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505418,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505418
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772432,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772432
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405334,
"mc2": 0.5410446803363212,
"mc2_stderr": 0.0155300726933085
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759987
},
"harness|gsm8k|5": {
"acc": 0.3305534495830174,
"acc_stderr": 0.012957496367085026
}
}
```
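The `"all"` block at the top of the results is an aggregate over the individual tasks. As a rough sketch (an assumption: the leaderboard's exact aggregation may weight or select tasks differently), an unweighted mean over per-task accuracies can be computed like this:

```python
# Sketch: unweighted mean of per-task "acc" values, as a plausible
# reconstruction of the "all" aggregate. The two sample entries are
# copied from the results above; the real dict contains every task.
results = {
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5789473684210527},
    "harness|hendrycksTest-virology|5": {"acc": 0.46987951807228917},
}

accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(mean_acc)
```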
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
irds/mmarco_it_dev | ---
pretty_name: '`mmarco/it/dev`'
viewer: false
source_datasets: ['irds/mmarco_it']
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/it/dev`
The `mmarco/it/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/it/dev).
# Data
This dataset provides:
- `queries` (i.e., topics); count=101,093
 - `qrels` (relevance assessments); count=59,273
- For `docs`, use [`irds/mmarco_it`](https://huggingface.co/datasets/irds/mmarco_it)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mmarco_it_dev', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mmarco_it_dev', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
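As a small, self-contained illustration of how the qrels records shown above can be turned into a lookup table for evaluation (the field names match the record shape above; the sample values are invented):

```python
# Build a {query_id: {doc_id: relevance}} lookup from qrels records.
# The sample records are hypothetical; real ones come from
# load_dataset('irds/mmarco_it_dev', 'qrels').
sample_qrels = [
    {"query_id": "q1", "doc_id": "d10", "relevance": 1, "iteration": "0"},
    {"query_id": "q1", "doc_id": "d11", "relevance": 1, "iteration": "0"},
    {"query_id": "q2", "doc_id": "d20", "relevance": 1, "iteration": "0"},
]

qrels_by_query = {}
for rec in sample_qrels:
    qrels_by_query.setdefault(rec["query_id"], {})[rec["doc_id"]] = rec["relevance"]

print(qrels_by_query["q1"])  # relevance judgments for query q1
```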
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
medmac01/qa_morocco_history_v1 | ---
task_categories:
- question-answering
language:
- fr
- en
tags:
- extractive_qa
size_categories:
- 1K<n<10K
---
|
SEIEZ/test2-pr-tr-1person | ---
license: mit
---
|
nguyenthanhdo/ultrachat-aem-alpaca-v1.0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 208601043
num_examples: 54411
download_size: 126826003
dataset_size: 208601043
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ultrachat-aem-alpaca-v1.0"
This dataset is a subset of https://huggingface.co/datasets/stingning/ultrachat.
This dataset focuses on the question-answering task over an existing context, selected with a simple keyword filter (any question containing one of these keywords: passage, article, context). I also extract only the first round of each conversation, convert it to the familiar alpaca format, and further filter so that the dataset only contains long inputs (which, in my opinion, means complex instructions).
Code to generate the dataset:
```py
from datasets import load_dataset

# Load the last four shards of UltraChat.
ultra = load_dataset(
    "stingning/ultrachat",
    data_files=[
        "train_6.jsonl",
        "train_7.jsonl",
        "train_8.jsonl",
        "train_9.jsonl",
    ],
    split="train",
)

def get_first_turn(example):
    # Keep only the first round of the conversation and map it to
    # the alpaca-style (instruction, input, output) format.
    data = example.pop("data")
    example["instruction"] = data[0]
    example["input"] = ""
    example["output"] = data[1]
    return example

## Assistance on Existing Materials
def aem(example):
    # Keep conversations whose first instruction references existing material.
    keywords = ["article", "context", "passage"]
    first_instruction = example["data"][0].lower()
    return any(kw in first_instruction for kw in keywords)

ultra_aem = ultra.filter(aem)
# Keep only long inputs (more than 200 whitespace-separated tokens).
ultra_aem_long = ultra_aem.filter(lambda x: len(x["data"][0].split()) > 200)
ultra_aem_first_turn = ultra_aem_long.map(get_first_turn)
ultra_aem_first_turn.push_to_hub("nguyenthanhdo/ultrachat-aem-alpaca-v1.0")
```
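The filtering and conversion logic can be exercised on a toy record without downloading UltraChat; this sketch mirrors the keyword filter and first-turn extraction described above (the sample conversation is invented):

```python
KEYWORDS = ["article", "context", "passage"]

def is_aem(record):
    # Keep conversations whose first user turn references existing material.
    return any(kw in record["data"][0].lower() for kw in KEYWORDS)

def to_alpaca(record):
    # First round of the conversation -> alpaca-style fields.
    instruction, output = record["data"][0], record["data"][1]
    return {"id": record["id"], "instruction": instruction,
            "input": "", "output": output}

sample = {"id": "0",
          "data": ["Summarize the passage: ...", "The passage says ..."]}
assert is_aem(sample)
print(to_alpaca(sample)["instruction"])
```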
**TODO**
The intended use for this dataset is closed question answering only, but UltraChat also contains rewriting, translation, and summarization tasks.
- Keep only the question-answering task through further filtering; currently this dataset is still contaminated with samples from other tasks.
- Add better filtering to separate the 4 tasks: question answering, rewriting, translation, and summarization. |
Gabriel1322/jotase | ---
license: openrail
---
|
646e62/skca-2015 | ---
license: apache-2.0
---
|
jlbaker361/cyberpunk-1500-cropped | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: frame
dtype: int64
- name: title
dtype: string
splits:
- name: train
num_bytes: 36378730.0
num_examples: 167
download_size: 36372233
dataset_size: 36378730.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
plaguss/test-distiset-1 | ---
dataset_info:
- config_name: leaf_step_1
features:
- name: a
dtype: int64
splits:
- name: test
num_bytes: 8
num_examples: 1
- name: train
num_bytes: 24
num_examples: 3
download_size: 2485
dataset_size: 32.0
- config_name: leaf_step_2
features:
- name: a
dtype: int64
- name: b
dtype: int64
splits:
- name: test
num_bytes: 16
num_examples: 1
- name: train
num_bytes: 64
num_examples: 4
download_size: 3885
dataset_size: 80.0
configs:
- config_name: leaf_step_1
data_files:
- split: train
path: leaf_step_1/train-*
- split: test
path: leaf_step_1/test-*
- config_name: leaf_step_2
data_files:
- split: train
path: leaf_step_2/train-*
- split: test
path: leaf_step_2/test-*
---
|
anlp/sentence_w_elimination | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: sentences
sequence: string
- name: new_gt
sequence: string
splits:
- name: train
num_bytes: 1201528
num_examples: 990
download_size: 244599
dataset_size: 1201528
---
# Dataset Card for "sentence_w_elimination"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jeffreyhuber/state_of_the_union | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 39305
num_examples: 365
download_size: 25872
dataset_size: 39305
---
# Dataset Card for "state_of_the_union"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B | ---
pretty_name: Evaluation run of Sao10K/Medusa-1.1-L2-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Medusa-1.1-L2-7B](https://huggingface.co/Sao10K/Medusa-1.1-L2-7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T22:27:14.314386](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B/blob/main/results_2023-10-23T22-27-14.314386.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2837667785234899,\n\
\ \"em_stderr\": 0.004616870115379374,\n \"f1\": 0.3653198406040281,\n\
\ \"f1_stderr\": 0.004545820875148166,\n \"acc\": 0.3824984008238525,\n\
\ \"acc_stderr\": 0.007721122557033827\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2837667785234899,\n \"em_stderr\": 0.004616870115379374,\n\
\ \"f1\": 0.3653198406040281,\n \"f1_stderr\": 0.004545820875148166\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \
\ \"acc_stderr\": 0.003282055917136963\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7505919494869772,\n \"acc_stderr\": 0.01216018919693069\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Medusa-1.1-L2-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|arc:challenge|25_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T22_27_14.314386
path:
- '**/details_harness|drop|3_2023-10-23T22-27-14.314386.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T22-27-14.314386.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T22_27_14.314386
path:
- '**/details_harness|gsm8k|5_2023-10-23T22-27-14.314386.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T22-27-14.314386.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hellaswag|10_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T09-52-20.607338.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T09-52-20.607338.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T09-52-20.607338.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T22_27_14.314386
path:
- '**/details_harness|winogrande|5_2023-10-23T22-27-14.314386.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T22-27-14.314386.parquet'
- config_name: results
data_files:
- split: 2023_09_12T09_52_20.607338
path:
- results_2023-09-12T09-52-20.607338.parquet
- split: 2023_10_23T22_27_14.314386
path:
- results_2023-10-23T22-27-14.314386.parquet
- split: latest
path:
- results_2023-10-23T22-27-14.314386.parquet
---
# Dataset Card for Evaluation run of Sao10K/Medusa-1.1-L2-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Medusa-1.1-L2-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Medusa-1.1-L2-7B](https://huggingface.co/Sao10K/Medusa-1.1-L2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-23T22:27:14.314386](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B/blob/main/results_2023-10-23T22-27-14.314386.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.2837667785234899,
"em_stderr": 0.004616870115379374,
"f1": 0.3653198406040281,
"f1_stderr": 0.004545820875148166,
"acc": 0.3824984008238525,
"acc_stderr": 0.007721122557033827
},
"harness|drop|3": {
"em": 0.2837667785234899,
"em_stderr": 0.004616870115379374,
"f1": 0.3653198406040281,
"f1_stderr": 0.004545820875148166
},
"harness|gsm8k|5": {
"acc": 0.014404852160727824,
"acc_stderr": 0.003282055917136963
},
"harness|winogrande|5": {
"acc": 0.7505919494869772,
"acc_stderr": 0.01216018919693069
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
asas-ai/arabic_punctuation | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: dataset_name
dtype: string
- name: subset_name
dtype: string
- name: text_no_punc
dtype: string
splits:
- name: train
num_bytes: 7357785049
num_examples: 11738819
download_size: 3092363938
dataset_size: 7357785049
license: cc-by-4.0
---
# Dataset Card for "arabic_punctuation"
## Dataset Details
### Dataset Description
This is a curated dataset, specifically designed to facilitate the study of punctuation. It has undergone rigorous manual annotation and verification on the basis of sentence structure, with sentence boundaries clearly marked. The dataset is in three folders:
1. The ABC component of the Arabic Punctuation Dataset: This folder features the manually annotated punctuation gold standard. It consists of one chapter extracted from each of 45 non-fiction books by 36 authors from 19 different fields of study. It contains 45 text files with a total of 149K tokens in 13K sentences.
2. The CBT component: This folder has 1,085 text files in 60 sub-folders, containing the full text of complete book translations rendered from English into Arabic independently of this project. Their punctuation, we found, mirrors the English source-language texts; i.e., the sentence terminals in these Arabic texts follow the rules of English. This folder holds close to 3M words in more than 170K properly punctuated sentences.
3. The SSAC-UNPC component: This folder constitutes the third part of the Arabic Punctuation Dataset. It has close to 12M disconnected, disordered, complete sentences in 79 text files. These scrambled sentences were extracted from the predominantly legal Arabic subcorpus of the United Nations Parallel Corpus (UNPC). The punctuation here is authentic. It was done by the UN translators as part of their work. We consider this to be an excellent punctuation corpus because it mirrors the rule-governed punctuation of the English source documents, especially in relation to sentence terminals. These scrambled sentences total more than 309M words.
### Steps to reproduce
The ABC component was manually annotated and verified.
The CBT component consists of translated books extracted from an online library.
The SSAC-UNPC component consists of full sentences extracted from the Arabic component of the United Nations Parallel Corpus.
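The `text_no_punc` field pairs each example with a punctuation-free variant of `text`. As a minimal, hypothetical sketch (the exact punctuation inventory used by the curators is not documented here), such a field could be derived by stripping common Arabic and Latin punctuation marks and collapsing the leftover whitespace:

```python
import re

# Hypothetical punctuation inventory -- an assumption for illustration,
# not the curators' actual character set.
ARABIC_PUNCT = "،؛؟«»"              # Arabic comma, semicolon, question mark, quotes
LATIN_PUNCT = r".,;:!?\"'()\[\]{}\-"  # common Latin punctuation

# Character class matching any single punctuation mark from either set.
_PUNC_RE = re.compile(f"[{re.escape(ARABIC_PUNCT)}{LATIN_PUNCT}]")

def strip_punctuation(text: str) -> str:
    """Replace punctuation with spaces, then collapse runs of whitespace."""
    no_punc = _PUNC_RE.sub(" ", text)
    return re.sub(r"\s+", " ", no_punc).strip()

print(strip_punctuation("هل أنت بخير؟ نعم، شكراً."))
```

A real pipeline would likely also normalize Unicode and handle ellipses and dashes, but this illustrates the `text` → `text_no_punc` mapping.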
## Citation
```
@misc{yagi_elnagar_2024,
  title     = {Arabic Punctuation Dataset},
  author    = {Yagi, Sane and Elnagar, Ashraf},
  publisher = {Mendeley Data},
  year      = {2024},
  month     = {Jan},
  url       = {https://data.mendeley.com/datasets/2pkxckwgs3/1}
}
``` |
awacke1/LOINC-CodeSet-Value-Description-Semantic-Set.csv | ---
license: mit
---
|
open-llm-leaderboard/details_Walmart-the-bag__Misted-v2-7B | ---
pretty_name: Evaluation run of Walmart-the-bag/Misted-v2-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Walmart-the-bag/Misted-v2-7B](https://huggingface.co/Walmart-the-bag/Misted-v2-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Walmart-the-bag__Misted-v2-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T19:55:55.941611](https://huggingface.co/datasets/open-llm-leaderboard/details_Walmart-the-bag__Misted-v2-7B/blob/main/results_2024-04-15T19-55-55.941611.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6227572973505467,\n\
\ \"acc_stderr\": 0.0328535387231475,\n \"acc_norm\": 0.6265982802703474,\n\
\ \"acc_norm_stderr\": 0.033511186837411694,\n \"mc1\": 0.47123623011015914,\n\
\ \"mc1_stderr\": 0.01747451384852552,\n \"mc2\": 0.641300631795274,\n\
\ \"mc2_stderr\": 0.015320525733509071\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6049488054607508,\n \"acc_stderr\": 0.014285898292938163,\n\
\ \"acc_norm\": 0.6527303754266212,\n \"acc_norm_stderr\": 0.013913034529620453\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.660426209918343,\n\
\ \"acc_stderr\": 0.004725967684806405,\n \"acc_norm\": 0.8528181637124079,\n\
\ \"acc_norm_stderr\": 0.0035356302890914492\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646796,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646796\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6161290322580645,\n \"acc_stderr\": 0.027666182075539635,\n \"\
acc_norm\": 0.6161290322580645,\n \"acc_norm_stderr\": 0.027666182075539635\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.024939313906940788,\n\
\ \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.024939313906940788\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066475,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066475\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.01646534546739152,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.01646534546739152\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565438,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565438\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406978,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406978\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n\
\ \"acc_stderr\": 0.01428337804429642,\n \"acc_norm\": 0.8007662835249042,\n\
\ \"acc_norm_stderr\": 0.01428337804429642\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4011173184357542,\n\
\ \"acc_stderr\": 0.016392221899407075,\n \"acc_norm\": 0.4011173184357542,\n\
\ \"acc_norm_stderr\": 0.016392221899407075\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.02623696588115327,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.02623696588115327\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n\
\ \"acc_stderr\": 0.012725701656953642,\n \"acc_norm\": 0.45827900912646674,\n\
\ \"acc_norm_stderr\": 0.012725701656953642\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.02909720956841195,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.02909720956841195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6421568627450981,\n \"acc_stderr\": 0.019393058402355435,\n \
\ \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.019393058402355435\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n\
\ \"acc_stderr\": 0.034288678487786564,\n \"acc_norm\": 0.6218905472636815,\n\
\ \"acc_norm_stderr\": 0.034288678487786564\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.026640582539133196,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.026640582539133196\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47123623011015914,\n\
\ \"mc1_stderr\": 0.01747451384852552,\n \"mc2\": 0.641300631795274,\n\
\ \"mc2_stderr\": 0.015320525733509071\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.011585871710209406\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4761182714177407,\n \
\ \"acc_stderr\": 0.013756765835465755\n }\n}\n```"
repo_url: https://huggingface.co/Walmart-the-bag/Misted-v2-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|arc:challenge|25_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|gsm8k|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hellaswag|10_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T19-55-55.941611.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T19-55-55.941611.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- '**/details_harness|winogrande|5_2024-04-15T19-55-55.941611.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T19-55-55.941611.parquet'
- config_name: results
data_files:
- split: 2024_04_15T19_55_55.941611
path:
- results_2024-04-15T19-55-55.941611.parquet
- split: latest
path:
- results_2024-04-15T19-55-55.941611.parquet
---
# Dataset Card for Evaluation run of Walmart-the-bag/Misted-v2-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Walmart-the-bag/Misted-v2-7B](https://huggingface.co/Walmart-the-bag/Misted-v2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset(
    "open-llm-leaderboard/details_Walmart-the-bag__Misted-v2-7B",
    "harness_winogrande_5",
    split="latest",
)
```
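The per-task config names listed in the YAML header follow a regular pattern: the harness prefix, the task name, and the few-shot count, joined by underscores. The snippet above therefore generalizes to any subtask. As a minimal sketch (the helper name below is ours, not part of any library), the config name for an MMLU subtask can be built like this:

```python
def mmlu_config_name(subtask: str, n_shot: int = 5) -> str:
    """Build the config name for an MMLU (hendrycksTest) subtask,
    e.g. "abstract_algebra" -> "harness_hendrycksTest_abstract_algebra_5"."""
    return f"harness_hendrycksTest_{subtask}_{n_shot}"

print(mmlu_config_name("abstract_algebra"))
# harness_hendrycksTest_abstract_algebra_5
```

The returned string can be passed as the second argument to `load_dataset` in place of `"harness_winogrande_5"` above.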
## Latest results
These are the [latest results from run 2024-04-15T19:55:55.941611](https://huggingface.co/datasets/open-llm-leaderboard/details_Walmart-the-bag__Misted-v2-7B/blob/main/results_2024-04-15T19-55-55.941611.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6227572973505467,
"acc_stderr": 0.0328535387231475,
"acc_norm": 0.6265982802703474,
"acc_norm_stderr": 0.033511186837411694,
"mc1": 0.47123623011015914,
"mc1_stderr": 0.01747451384852552,
"mc2": 0.641300631795274,
"mc2_stderr": 0.015320525733509071
},
"harness|arc:challenge|25": {
"acc": 0.6049488054607508,
"acc_stderr": 0.014285898292938163,
"acc_norm": 0.6527303754266212,
"acc_norm_stderr": 0.013913034529620453
},
"harness|hellaswag|10": {
"acc": 0.660426209918343,
"acc_stderr": 0.004725967684806405,
"acc_norm": 0.8528181637124079,
"acc_norm_stderr": 0.0035356302890914492
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.032662042990646796,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.032662042990646796
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6161290322580645,
"acc_stderr": 0.027666182075539635,
"acc_norm": 0.6161290322580645,
"acc_norm_stderr": 0.027666182075539635
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.024939313906940788,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.024939313906940788
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066475,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066475
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.01646534546739152,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.01646534546739152
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565438,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565438
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406978,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406978
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.01428337804429642,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.01428337804429642
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247333,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247333
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4011173184357542,
"acc_stderr": 0.016392221899407075,
"acc_norm": 0.4011173184357542,
"acc_norm_stderr": 0.016392221899407075
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.02623696588115327,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.02623696588115327
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45827900912646674,
"acc_stderr": 0.012725701656953642,
"acc_norm": 0.45827900912646674,
"acc_norm_stderr": 0.012725701656953642
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.019393058402355435,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.019393058402355435
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6218905472636815,
"acc_stderr": 0.034288678487786564,
"acc_norm": 0.6218905472636815,
"acc_norm_stderr": 0.034288678487786564
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.026640582539133196,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.026640582539133196
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47123623011015914,
"mc1_stderr": 0.01747451384852552,
"mc2": 0.641300631795274,
"mc2_stderr": 0.015320525733509071
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.011585871710209406
},
"harness|gsm8k|5": {
"acc": 0.4761182714177407,
"acc_stderr": 0.013756765835465755
}
}
```
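The per-task scores in a results payload shaped like the JSON above can be aggregated with plain Python. A minimal sketch, using a small hand-copied excerpt of the payload rather than the full run, computes the macro-average MMLU accuracy:

```python
# Excerpt of a results payload like the one above (three MMLU subtasks only).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6907894736842105},
}

# Collect the accuracies of every MMLU (hendrycksTest) subtask ...
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]

# ... and macro-average them (unweighted mean over subtasks).
mmlu_macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro-average acc over {len(mmlu_accs)} tasks: {mmlu_macro_avg:.4f}")
```

The same pattern applies to `acc_norm` or any other per-task metric; on the full payload it reproduces the kind of aggregate reported in the `"all"` entry.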
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
hsong1101/news_summarization | ---
license: pddl
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 4643521852
num_examples: 696389
- name: test
num_bytes: 1160885464
num_examples: 174098
download_size: 978222798
dataset_size: 5804407316
---
|
arresejo/llm-macron | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 991716.0
num_examples: 121
- name: test
num_bytes: 114744.0
num_examples: 14
download_size: 555903
dataset_size: 1106460.0
---
# Dataset Card for "llm-macron"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bitccty/scicorpus | ---
license: apache-2.0
---
|
tyzhu/squad_qa_no_id_v5_full | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7374223
num_examples: 5070
- name: validation
num_bytes: 342766
num_examples: 300
download_size: 1438089
dataset_size: 7716989
---
# Dataset Card for "squad_qa_no_id_v5_full"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
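The `answers` feature declared in the YAML above is a sequence of `text`/`answer_start` structs, which the `datasets` library materializes column-wise (parallel lists) rather than as a list of dicts. A minimal sketch of what one row would look like under that schema, with hypothetical values:

```python
# Hypothetical row illustrating the feature schema declared above:
# `answers` is a sequence of structs, stored as parallel lists.
row = {
    "id": "5733be284776f41900661182",  # hypothetical SQuAD-style id
    "title": "Example_Title",
    "context": "The tower is 324 metres tall.",
    "question": "How tall is the tower?",
    "answers": {
        "text": ["324 metres"],
        "answer_start": [13],
    },
    "answer": "324 metres",
    "context_id": "c0",
    "inputs": "question: How tall is the tower? context: The tower is 324 metres tall.",
    "targets": "324 metres",
}

# Each answer span can be checked against the context:
start = row["answers"]["answer_start"][0]
text = row["answers"]["text"][0]
assert row["context"][start:start + len(text)] == text
```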
open-llm-leaderboard/details_Locutusque__LocutusqueXFelladrin-TinyMistral248M-Instruct | ---
pretty_name: Evaluation run of Locutusque/LocutusqueXFelladrin-TinyMistral248M-Instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/LocutusqueXFelladrin-TinyMistral248M-Instruct](https://huggingface.co/Locutusque/LocutusqueXFelladrin-TinyMistral248M-Instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__LocutusqueXFelladrin-TinyMistral248M-Instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T13:05:29.280274](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__LocutusqueXFelladrin-TinyMistral248M-Instruct/blob/main/results_2023-12-16T13-05-29.280274.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2598863864332701,\n\
\ \"acc_stderr\": 0.03085871471372819,\n \"acc_norm\": 0.2612223474549382,\n\
\ \"acc_norm_stderr\": 0.03168256031998437,\n \"mc1\": 0.204406364749082,\n\
\ \"mc1_stderr\": 0.014117174337432621,\n \"mc2\": 0.40124313581017795,\n\
\ \"mc2_stderr\": 0.01490869512458324\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19965870307167236,\n \"acc_stderr\": 0.011681625756888676,\n\
\ \"acc_norm\": 0.24744027303754265,\n \"acc_norm_stderr\": 0.01261035266329267\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2757418840868353,\n\
\ \"acc_stderr\": 0.004459740315490862,\n \"acc_norm\": 0.2779326827325234,\n\
\ \"acc_norm_stderr\": 0.004470644845242891\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n\
\ \"acc_stderr\": 0.03406542058502652,\n \"acc_norm\": 0.1925925925925926,\n\
\ \"acc_norm_stderr\": 0.03406542058502652\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.03391160934343604,\n\
\ \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.03391160934343604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.17,\n\
\ \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \
\ \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.02761116340239972,\n\
\ \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.02761116340239972\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.033450369167889904,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.033450369167889904\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.030363582197238167,\n\
\ \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.030363582197238167\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374768,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374768\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633345,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633345\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523812,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523812\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.24193548387096775,\n \"acc_stderr\": 0.02436259969303109,\n \"\
acc_norm\": 0.24193548387096775,\n \"acc_norm_stderr\": 0.02436259969303109\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.031785297106427496,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.031785297106427496\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.37373737373737376,\n \"acc_stderr\": 0.034468977386593325,\n \"\
acc_norm\": 0.37373737373737376,\n \"acc_norm_stderr\": 0.034468977386593325\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3626943005181347,\n \"acc_stderr\": 0.034697137917043715,\n\
\ \"acc_norm\": 0.3626943005181347,\n \"acc_norm_stderr\": 0.034697137917043715\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.33589743589743587,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.33589743589743587,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371218,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371218\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.19205298013245034,\n \"acc_stderr\": 0.03216298420593613,\n \"\
acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.03216298420593613\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.30642201834862387,\n \"acc_stderr\": 0.019765517220458523,\n \"\
acc_norm\": 0.30642201834862387,\n \"acc_norm_stderr\": 0.019765517220458523\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3101851851851852,\n \"acc_stderr\": 0.031546962856566295,\n \"\
acc_norm\": 0.3101851851851852,\n \"acc_norm_stderr\": 0.031546962856566295\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.29411764705882354,\n \"acc_stderr\": 0.03198001660115071,\n \"\
acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.03198001660115071\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2109704641350211,\n \"acc_stderr\": 0.026558372502661923,\n \
\ \"acc_norm\": 0.2109704641350211,\n \"acc_norm_stderr\": 0.026558372502661923\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.22869955156950672,\n\
\ \"acc_stderr\": 0.028188240046929196,\n \"acc_norm\": 0.22869955156950672,\n\
\ \"acc_norm_stderr\": 0.028188240046929196\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.11570247933884298,\n \"acc_stderr\": 0.0291998024556228,\n \"\
acc_norm\": 0.11570247933884298,\n \"acc_norm_stderr\": 0.0291998024556228\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.44660194174757284,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.44660194174757284,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2094017094017094,\n\
\ \"acc_stderr\": 0.026655699653922737,\n \"acc_norm\": 0.2094017094017094,\n\
\ \"acc_norm_stderr\": 0.026655699653922737\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21966794380587484,\n\
\ \"acc_stderr\": 0.014805384478371176,\n \"acc_norm\": 0.21966794380587484,\n\
\ \"acc_norm_stderr\": 0.014805384478371176\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n\
\ \"acc_stderr\": 0.01473692638376196,\n \"acc_norm\": 0.2636871508379888,\n\
\ \"acc_norm_stderr\": 0.01473692638376196\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.024739981355113596,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.024739981355113596\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.2861736334405145,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.023788583551658533,\n\
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.023788583551658533\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.02551873104953777,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.02551873104953777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n\
\ \"acc_stderr\": 0.010966507972178477,\n \"acc_norm\": 0.2438070404172099,\n\
\ \"acc_norm_stderr\": 0.010966507972178477\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3713235294117647,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.3713235294117647,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.2636363636363636,\n \"acc_stderr\": 0.04220224692971987,\n\
\ \"acc_norm\": 0.2636363636363636,\n \"acc_norm_stderr\": 0.04220224692971987\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.028920583220675578,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.028920583220675578\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.2736318407960199,\n \"acc_stderr\": 0.03152439186555401,\n\
\ \"acc_norm\": 0.2736318407960199,\n \"acc_norm_stderr\": 0.03152439186555401\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.24096385542168675,\n \"acc_stderr\": 0.03329394119073529,\n\
\ \"acc_norm\": 0.24096385542168675,\n \"acc_norm_stderr\": 0.03329394119073529\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.19298245614035087,\n\
\ \"acc_stderr\": 0.030267457554898465,\n \"acc_norm\": 0.19298245614035087,\n\
\ \"acc_norm_stderr\": 0.030267457554898465\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.204406364749082,\n \"mc1_stderr\": 0.014117174337432621,\n\
\ \"mc2\": 0.40124313581017795,\n \"mc2_stderr\": 0.01490869512458324\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.4909234411996843,\n\
\ \"acc_stderr\": 0.014050170094497704\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/LocutusqueXFelladrin-TinyMistral248M-Instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|arc:challenge|25_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|gsm8k|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hellaswag|10_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T13-05-29.280274.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T13-05-29.280274.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- '**/details_harness|winogrande|5_2023-12-16T13-05-29.280274.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T13-05-29.280274.parquet'
- config_name: results
data_files:
- split: 2023_12_16T13_05_29.280274
path:
- results_2023-12-16T13-05-29.280274.parquet
- split: latest
path:
- results_2023-12-16T13-05-29.280274.parquet
---
# Dataset Card for Evaluation run of Locutusque/LocutusqueXFelladrin-TinyMistral248M-Instruct
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/LocutusqueXFelladrin-TinyMistral248M-Instruct](https://huggingface.co/Locutusque/LocutusqueXFelladrin-TinyMistral248M-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__LocutusqueXFelladrin-TinyMistral248M-Instruct",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-16T13:05:29.280274](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__LocutusqueXFelladrin-TinyMistral248M-Instruct/blob/main/results_2023-12-16T13-05-29.280274.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's scores under its "latest" split):
```python
{
"all": {
"acc": 0.2598863864332701,
"acc_stderr": 0.03085871471372819,
"acc_norm": 0.2612223474549382,
"acc_norm_stderr": 0.03168256031998437,
"mc1": 0.204406364749082,
"mc1_stderr": 0.014117174337432621,
"mc2": 0.40124313581017795,
"mc2_stderr": 0.01490869512458324
},
"harness|arc:challenge|25": {
"acc": 0.19965870307167236,
"acc_stderr": 0.011681625756888676,
"acc_norm": 0.24744027303754265,
"acc_norm_stderr": 0.01261035266329267
},
"harness|hellaswag|10": {
"acc": 0.2757418840868353,
"acc_stderr": 0.004459740315490862,
"acc_norm": 0.2779326827325234,
"acc_norm_stderr": 0.004470644845242891
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1925925925925926,
"acc_stderr": 0.03406542058502652,
"acc_norm": 0.1925925925925926,
"acc_norm_stderr": 0.03406542058502652
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.03391160934343604,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.03391160934343604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.02761116340239972,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.02761116340239972
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.033450369167889904,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.033450369167889904
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.030363582197238167,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.030363582197238167
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374768,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374768
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633345,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633345
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523812,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523812
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24193548387096775,
"acc_stderr": 0.02436259969303109,
"acc_norm": 0.24193548387096775,
"acc_norm_stderr": 0.02436259969303109
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.031785297106427496,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.031785297106427496
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.37373737373737376,
"acc_stderr": 0.034468977386593325,
"acc_norm": 0.37373737373737376,
"acc_norm_stderr": 0.034468977386593325
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3626943005181347,
"acc_stderr": 0.034697137917043715,
"acc_norm": 0.3626943005181347,
"acc_norm_stderr": 0.034697137917043715
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33589743589743587,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.33589743589743587,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02671924078371218,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02671924078371218
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3277310924369748,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.3277310924369748,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.19205298013245034,
"acc_stderr": 0.03216298420593613,
"acc_norm": 0.19205298013245034,
"acc_norm_stderr": 0.03216298420593613
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.30642201834862387,
"acc_stderr": 0.019765517220458523,
"acc_norm": 0.30642201834862387,
"acc_norm_stderr": 0.019765517220458523
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3101851851851852,
"acc_stderr": 0.031546962856566295,
"acc_norm": 0.3101851851851852,
"acc_norm_stderr": 0.031546962856566295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.03198001660115071,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.03198001660115071
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2109704641350211,
"acc_stderr": 0.026558372502661923,
"acc_norm": 0.2109704641350211,
"acc_norm_stderr": 0.026558372502661923
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.22869955156950672,
"acc_stderr": 0.028188240046929196,
"acc_norm": 0.22869955156950672,
"acc_norm_stderr": 0.028188240046929196
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.11570247933884298,
"acc_stderr": 0.0291998024556228,
"acc_norm": 0.11570247933884298,
"acc_norm_stderr": 0.0291998024556228
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.44660194174757284,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.44660194174757284,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2094017094017094,
"acc_stderr": 0.026655699653922737,
"acc_norm": 0.2094017094017094,
"acc_norm_stderr": 0.026655699653922737
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21966794380587484,
"acc_stderr": 0.014805384478371176,
"acc_norm": 0.21966794380587484,
"acc_norm_stderr": 0.014805384478371176
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757183,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.01473692638376196,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.01473692638376196
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.024739981355113596,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.024739981355113596
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2861736334405145,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.2861736334405145,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.02551873104953777,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.02551873104953777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178477,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178477
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3713235294117647,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.3713235294117647,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.028920583220675578,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.028920583220675578
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2736318407960199,
"acc_stderr": 0.03152439186555401,
"acc_norm": 0.2736318407960199,
"acc_norm_stderr": 0.03152439186555401
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.24096385542168675,
"acc_stderr": 0.03329394119073529,
"acc_norm": 0.24096385542168675,
"acc_norm_stderr": 0.03329394119073529
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.19298245614035087,
"acc_stderr": 0.030267457554898465,
"acc_norm": 0.19298245614035087,
"acc_norm_stderr": 0.030267457554898465
},
"harness|truthfulqa:mc|0": {
"mc1": 0.204406364749082,
"mc1_stderr": 0.014117174337432621,
"mc2": 0.40124313581017795,
"mc2_stderr": 0.01490869512458324
},
"harness|winogrande|5": {
"acc": 0.4909234411996843,
"acc_stderr": 0.014050170094497704
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
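The `"all"` block above is, up to the leaderboard's own weighting, a macro-average of the per-task scores. A minimal sketch of recomputing such a macro-average from a few of the entries above (an illustrative subset only, not the full task list the leaderboard averages over):

```python
# Macro-average accuracy over a hypothetical subset of the per-task
# results shown above. The leaderboard's "all" figure averages every
# harness task, so this subset mean will not match it exactly.
scores = {
    "harness|hendrycksTest-abstract_algebra|5": 0.26,
    "harness|hendrycksTest-anatomy|5": 0.1925925925925926,
    "harness|hendrycksTest-astronomy|5": 0.2236842105263158,
}

# Unweighted mean over the selected tasks.
macro_acc = sum(scores.values()) / len(scores)
print(f"macro acc over {len(scores)} tasks: {macro_acc:.4f}")
```

The same pattern scales to all tasks: collect each `acc` (or `acc_norm`) value from the `results` config and average them.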
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tyzhu/squad_qa_wrong_num_v5_full_random_permute_4 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 6558153.253911806
num_examples: 4345
- name: validation
num_bytes: 346484
num_examples: 300
download_size: 1363986
dataset_size: 6904637.253911806
---
# Dataset Card for "squad_qa_wrong_num_v5_full_random_permute_4"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BuyKlonopin/BuyKlonopinOnline | ---
license: bigscience-openrail-m
---
|
LahiruLowe/falcon-40b-sft-top1-560_niv2_explanation_targets | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
- name: explained_targets
dtype: string
splits:
- name: train
num_bytes: 1111
num_examples: 1
download_size: 10191
dataset_size: 1111
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "falcon-40b-sft-top1-560_niv2_explanation_targets"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-100k-ft | ---
pretty_name: Evaluation run of Yukang/Llama-2-7b-longlora-100k-ft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yukang/Llama-2-7b-longlora-100k-ft](https://huggingface.co/Yukang/Llama-2-7b-longlora-100k-ft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-100k-ft\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-03T15:58:28.063022](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-100k-ft/blob/main/results_2023-12-03T15-58-28.063022.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"\
acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \
\ \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Yukang/Llama-2-7b-longlora-100k-ft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|arc:challenge|25_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T18_35_01.826306
path:
- '**/details_harness|drop|3_2023-10-24T18-35-01.826306.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T18-35-01.826306.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T18_35_01.826306
path:
- '**/details_harness|gsm8k|5_2023-10-24T18-35-01.826306.parquet'
- split: 2023_12_03T15_58_28.063022
path:
- '**/details_harness|gsm8k|5_2023-12-03T15-58-28.063022.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-03T15-58-28.063022.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hellaswag|10_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T18_35_01.826306
path:
- '**/details_harness|winogrande|5_2023-10-24T18-35-01.826306.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T18-35-01.826306.parquet'
- config_name: results
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- results_2023-10-03T23-44-33.008703.parquet
- split: 2023_10_24T18_35_01.826306
path:
- results_2023-10-24T18-35-01.826306.parquet
- split: 2023_12_03T15_58_28.063022
path:
- results_2023-12-03T15-58-28.063022.parquet
- split: latest
path:
- results_2023-12-03T15-58-28.063022.parquet
---
# Dataset Card for Evaluation run of Yukang/Llama-2-7b-longlora-100k-ft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Yukang/Llama-2-7b-longlora-100k-ft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Yukang/Llama-2-7b-longlora-100k-ft](https://huggingface.co/Yukang/Llama-2-7b-longlora-100k-ft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-100k-ft",
"harness_gsm8k_5",
	split="latest")
```
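Each per-run split visible in the configuration above is named after the run timestamp, with `-` and `:` replaced by `_` (the `T` separator and the fractional-second dot are kept). To pin a specific run programmatically, a small helper along these lines can derive the split name from a timestamp (the function is our own sketch, not part of the `datasets` API):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Derive a configuration split name from a run timestamp.

    Split names are the run timestamp with '-' and ':' replaced by '_';
    the 'T' separator and the fractional-second dot are kept as-is.
    """
    return ts.replace("-", "_").replace(":", "_")

# e.g. the latest run of this model:
print(run_timestamp_to_split("2023-12-03T15:58:28.063022"))  # 2023_12_03T15_58_28.063022
```

The derived name can then be passed as the `split=` argument of `load_dataset` to load that particular run instead of the latest one.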
## Latest results
These are the [latest results from run 2023-12-03T15:58:28.063022](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-100k-ft/blob/main/results_2023-12-03T15-58-28.063022.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
sudiptabasak/expressions-vectors | ---
license: mit
---
|
noza-kit/wmt23_enjp_train_jppt_ex1 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: en
dtype: string
- name: jp
dtype: string
splits:
- name: train
num_bytes: 9937341
num_examples: 41844
download_size: 4632720
dataset_size: 9937341
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/cattleya_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of cattleya/カトレア (Pokémon)
This is the dataset of cattleya/カトレア (Pokémon), containing 367 images and their tags.
The core tags of this character are `long_hair, blonde_hair, hat, very_long_hair, breasts, green_eyes, blue_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 367 | 314.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cattleya_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 367 | 206.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cattleya_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 690 | 368.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cattleya_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 367 | 287.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cattleya_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 690 | 487.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cattleya_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/cattleya_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
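The `IMG+TXT` packages instead pair each image with a same-named `.txt` file containing its comma-separated tags. If you extract one of those archives and prefer not to depend on waifuc, a plain-Python sketch along these lines can walk the pairs (the helper name and the assumed file layout are ours, inferred from the package descriptions above):

```python
import os


def iter_img_txt_pairs(dataset_dir: str):
    """Yield (image_path, tags) for each image/.txt pair in an IMG+TXT package."""
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if not os.path.exists(txt_path):
            # image without a tag file; skip it
            continue
        with open(txt_path, encoding="utf-8") as f:
            tags = [t.strip() for t in f.read().split(",") if t.strip()]
        yield os.path.join(dataset_dir, name), tags
```

This keeps the tag lists as plain Python lists, which is convenient for the kind of tag-frequency or clustering analysis shown in the tables below.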
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, nude, solo, blush, navel, nipples, small_breasts, pussy, smile |
| 1 | 9 |  |  |  |  |  | 1boy, 1girl, blush, hetero, sex, solo_focus, vaginal, censored, nipples, penis, spread_legs, nude, open_mouth, small_breasts, smile, cum_in_pussy, navel, on_back |
| 2 | 7 |  |  |  |  |  | 1girl, blush, hetero, nipples, nude, penis, solo_focus, 1boy, small_breasts, fellatio, medium_breasts, uncensored |
| 3 | 7 |  |  |  |  |  | 1boy, 1girl, handjob, hetero, penis, solo_focus, bar_censor, blush, nipples, nude, pointless_censoring, smile |
| 4 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, mosaic_censoring, nipples, paizuri, penis, solo_focus, cum_on_breasts, looking_at_viewer, on_back, open_mouth, heart, huge_breasts, pov, white_headwear, ;o, bare_shoulders, large_breasts, one_eye_closed, parted_bangs, shirt, speech_bubble |
| 5 | 5 |  |  |  |  |  | 1girl, dress, solo |
| 6 | 14 |  |  |  |  |  | 1girl, long_sleeves, parted_bangs, eyelashes, looking_at_viewer, closed_mouth, pink_footwear, shoes, pink_headwear, collarbone, pink_dress, pokemon_(creature), solo, full_body, white_headwear, sitting |
| 7 | 6 |  |  |  |  |  | 1girl, dress, long_sleeves, looking_at_viewer, solo, eyelashes, parted_bangs, hand_up, open_mouth, pink_headwear, aqua_eyes |
| 8 | 6 |  |  |  |  |  | 1girl, black_dress, hair_ornament, looking_at_viewer, official_alternate_costume, sidelocks, blush, detached_sleeves, parted_bangs, pokemon_(creature), ponytail, black_choker, closed_mouth, eyelashes, pantyhose, bare_shoulders, red_gemstone |
| 9 | 6 |  |  |  |  |  | 1girl, pokemon_(creature), lying, sleeping, brown_hair, closed_eyes, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | nude | solo | blush | navel | nipples | small_breasts | pussy | smile | 1boy | hetero | sex | solo_focus | vaginal | censored | penis | spread_legs | open_mouth | cum_in_pussy | on_back | fellatio | medium_breasts | uncensored | handjob | bar_censor | pointless_censoring | mosaic_censoring | paizuri | cum_on_breasts | looking_at_viewer | heart | huge_breasts | pov | white_headwear | ;o | bare_shoulders | large_breasts | one_eye_closed | parted_bangs | shirt | speech_bubble | dress | long_sleeves | eyelashes | closed_mouth | pink_footwear | shoes | pink_headwear | collarbone | pink_dress | pokemon_(creature) | full_body | sitting | hand_up | aqua_eyes | black_dress | hair_ornament | official_alternate_costume | sidelocks | detached_sleeves | ponytail | black_choker | pantyhose | red_gemstone | lying | sleeping | brown_hair | closed_eyes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------|:--------|:--------|:----------|:----------------|:--------|:--------|:-------|:---------|:------|:-------------|:----------|:-----------|:--------|:--------------|:-------------|:---------------|:----------|:-----------|:-----------------|:-------------|:----------|:-------------|:----------------------|:-------------------|:----------|:-----------------|:--------------------|:--------|:---------------|:------|:-----------------|:-----|:-----------------|:----------------|:-----------------|:---------------|:--------|:----------------|:--------|:---------------|:------------|:---------------|:----------------|:--------|:----------------|:-------------|:-------------|:---------------------|:------------|:----------|:----------|:------------|:--------------|:----------------|:-----------------------------|:------------|:-------------------|:-----------|:---------------|:------------|:---------------|:--------|:-----------|:-------------|:--------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | | X | | X | X | | | X | X | | X | | | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | | X | | X | | | X | X | X | | X | | | X | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | | X | | X | | | | X | X | | X | | | X | | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 14 |  |  |  |  |  | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | X | | | X | X | X | | | | X | | | | | | X | X | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | X | | | | | X | X | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | | | | |
| 9 | 6 |  |  |  |  |  | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X |
|
AdapterOcean/data-standardized_cluster_9 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 41042768
num_examples: 3814
download_size: 11939810
dataset_size: 41042768
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LauraRuis/tom_rlhf | ---
license: mit
task_categories:
- text-generation
pretty_name: tom
size_categories:
- 10K<n<100K
--- |
torileatherman/sentiment_analysis_batch_predictions | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Weyaxi__MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp | ---
pretty_name: Evaluation run of Weyaxi/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp](https://huggingface.co/Weyaxi/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-08T05:11:37.271243](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp/blob/main/results_2024-01-08T05-11-37.271243.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6464533975388377,\n\
\ \"acc_stderr\": 0.032163810731246786,\n \"acc_norm\": 0.6464814911400231,\n\
\ \"acc_norm_stderr\": 0.03282564461917708,\n \"mc1\": 0.39412484700122397,\n\
\ \"mc1_stderr\": 0.017106588140700322,\n \"mc2\": 0.5513669244614883,\n\
\ \"mc2_stderr\": 0.015335304188531462\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.014144193471893449,\n\
\ \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.013975454122756562\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6642103166699861,\n\
\ \"acc_stderr\": 0.004713006072807707,\n \"acc_norm\": 0.8537143995220076,\n\
\ \"acc_norm_stderr\": 0.0035267007418794435\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754406,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754406\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055256,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055256\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945627,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945627\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368982,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368982\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577615,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577615\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38324022346368714,\n\
\ \"acc_stderr\": 0.016260159604429128,\n \"acc_norm\": 0.38324022346368714,\n\
\ \"acc_norm_stderr\": 0.016260159604429128\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\
\ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\
\ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061463,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061463\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39412484700122397,\n\
\ \"mc1_stderr\": 0.017106588140700322,\n \"mc2\": 0.5513669244614883,\n\
\ \"mc2_stderr\": 0.015335304188531462\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.011430450045881573\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7103866565579985,\n \
\ \"acc_stderr\": 0.012493927348659629\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|arc:challenge|25_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|gsm8k|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hellaswag|10_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T05-11-37.271243.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T05-11-37.271243.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- '**/details_harness|winogrande|5_2024-01-08T05-11-37.271243.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-08T05-11-37.271243.parquet'
- config_name: results
data_files:
- split: 2024_01_08T05_11_37.271243
path:
- results_2024-01-08T05-11-37.271243.parquet
- split: latest
path:
- results_2024-01-08T05-11-37.271243.parquet
---
# Dataset Card for Evaluation run of Weyaxi/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp](https://huggingface.co/Weyaxi/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-08T05:11:37.271243](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp/blob/main/results_2024-01-08T05-11-37.271243.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6464533975388377,
"acc_stderr": 0.032163810731246786,
"acc_norm": 0.6464814911400231,
"acc_norm_stderr": 0.03282564461917708,
"mc1": 0.39412484700122397,
"mc1_stderr": 0.017106588140700322,
"mc2": 0.5513669244614883,
"mc2_stderr": 0.015335304188531462
},
"harness|arc:challenge|25": {
"acc": 0.6254266211604096,
"acc_stderr": 0.014144193471893449,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.013975454122756562
},
"harness|hellaswag|10": {
"acc": 0.6642103166699861,
"acc_stderr": 0.004713006072807707,
"acc_norm": 0.8537143995220076,
"acc_norm_stderr": 0.0035267007418794435
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754406,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055256,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055256
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945627,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368982,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577615,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577615
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38324022346368714,
"acc_stderr": 0.016260159604429128,
"acc_norm": 0.38324022346368714,
"acc_norm_stderr": 0.016260159604429128
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061463,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061463
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39412484700122397,
"mc1_stderr": 0.017106588140700322,
"mc2": 0.5513669244614883,
"mc2_stderr": 0.015335304188531462
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.011430450045881573
},
"harness|gsm8k|5": {
"acc": 0.7103866565579985,
"acc_stderr": 0.012493927348659629
}
}
```
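The aggregate numbers under `"all"` are averages of the per-task scores. As a minimal offline sketch of working with this JSON, the snippet below averages the `acc` of two MMLU (`hendrycksTest`) entries inlined from the results above; the full computation would of course run over every task.

```python
# Sketch: recover an aggregate accuracy by averaging per-task "acc" values.
# Only two entries from the results JSON above are inlined here.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5421686746987951},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.847953216374269},
}

# Keep only the MMLU (hendrycksTest) tasks, then average their accuracies.
mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
avg_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(round(avg_acc, 4))
```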
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
TurkuNLP/jigsaw_toxicity_pred_fi | ---
license: cc-by-sa-4.0
task_categories:
- text-classification
task_ids:
- multi-label-classification
language:
- fi
multilinguality:
- translation
tags:
- toxicity
- multi-label
source_datasets:
- extended|jigsaw_toxicity_pred
size_categories:
- 100K<n<1M
---
### Dataset Summary
This dataset is a DeepL-based machine-translated version of the Jigsaw toxicity dataset for Finnish. The dataset originally comes from a Kaggle competition: https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge/data.
The dataset poses a multi-label text classification problem and includes the labels `identity_attack`, `insult`, `obscene`, `severe_toxicity`, `threat` and `toxicity`.
#### Example data
```
{
"label_identity_attack": 0,
"label_insult": 0,
"label_obscene": 0,
"label_severe_toxicity": 0,
"label_threat": 0,
"label_toxicity": 0,
"lang": "fi-deepl",
"text": "\" \n\n Hei Pieter Pietersen, ja tervetuloa Wikipediaan! \n\n Tervetuloa Wikipediaan! Toivottavasti viihdyt tietosanakirjassa ja haluat jäädä tänne. Ensimmäiseksi voit lukea johdannon. \n\n Jos sinulla on kysyttävää, voit kysyä minulta keskustelusivullani - autan mielelläni. Tai voit kysyä kysymyksesi Uusien avustajien ohjesivulla. \n\n - \n Seuraavassa on lisää resursseja, jotka auttavat sinua tutkimaan ja osallistumaan maailman suurinta tietosanakirjaa.... \n\n Löydät perille: \n\n \n * Sisällysluettelo \n\n * Osastohakemisto \n\n \n Tarvitsetko apua? \n\n \n * Kysymykset - opas siitä, mistä voi esittää kysymyksiä. \n * Huijausluettelo - pikaohje Wikipedian merkintäkoodeista. \n\n * Wikipedian 5 pilaria - yleiskatsaus Wikipedian perustaan. \n * The Simplified Ruleset - yhteenveto Wikipedian tärkeimmistä säännöistä. \n\n \n Miten voit auttaa: \n\n \n * Wikipedian avustaminen - opas siitä, miten voit auttaa. \n\n * Yhteisöportaali - Wikipedian toiminnan keskus. \n\n \n Lisää vinkkejä... \n\n \n * Allekirjoita viestisi keskustelusivuilla neljällä tildillä (~~~~). Tämä lisää automaattisesti \"\"allekirjoituksesi\"\" (käyttäjänimesi ja päivämääräleima). Myös Wikipedian tekstinmuokkausikkunan yläpuolella olevassa työkalupalkissa oleva painike tekee tämän. \n\n * Jos haluat leikkiä uusilla Wiki-taidoillasi, Hiekkalaatikko on sinua varten. \n\n \n Onnea ja hauskaa. \""
}
```
### Data Fields
Fields prefixed with `label_` take the value `0` when the text does *not* contain that category of toxicity and `1` when it does.
- `label_identity_attack`: a `int64` feature.
- `label_insult`: a `int64` feature.
- `label_obscene`: a `int64` feature.
- `label_severe_toxicity`: a `int64` feature.
- `label_threat`: a `int64` feature.
- `label_toxicity`: a `int64` feature.
- `lang`: a `string` feature.
- `text`: a `string` feature.
### Data Splits
The splits are the same as in the original English data.
| dataset | train | test |
| -------- | -----: | ---------: |
| TurkuNLP/jigsaw_toxicity_pred_fi| 159571 | 63978 |
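For multi-label training, the six `label_*` columns are usually packed into one binary vector per example. A minimal sketch (the helper name and the example row are hypothetical; the field names follow the schema above):

```python
# Column names as listed in the Data Fields section above.
LABELS = [
    "label_identity_attack", "label_insult", "label_obscene",
    "label_severe_toxicity", "label_threat", "label_toxicity",
]

def to_multihot(row):
    """Pack the six binary label columns into one multi-hot label vector."""
    return [int(row[name]) for name in LABELS]

# Hypothetical example row: an insulting, toxic comment.
row = {"label_identity_attack": 0, "label_insult": 1, "label_obscene": 0,
       "label_severe_toxicity": 0, "label_threat": 0, "label_toxicity": 1,
       "lang": "fi-deepl", "text": "..."}
print(to_multihot(row))  # [0, 1, 0, 0, 0, 1]
```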
### Evaluation Results
Results from fine-tuning [TurkuNLP/bert-large-finnish-cased-v1](https://huggingface.co/TurkuNLP/bert-large-finnish-cased-v1) for multi-label toxicity detection. The fine-tuned model can be found
| dataset | F1-micro | Precision | Recall |
| -------------------- | ----: | ---: | ----: |
| TurkuNLP/jigsaw_toxicity_pred_fi | 0.66 | 0.58 | 0.76 |
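The table reports micro-averaged scores, i.e. true/false positives are pooled across all six labels before computing precision, recall, and F1. A toy sketch of that computation on hypothetical predictions:

```python
# Toy sketch of micro-averaged precision/recall/F1 for multi-label output.
# y_true / y_pred are hypothetical (n_examples x n_labels) binary matrices.
y_true = [[0, 1, 0, 0, 0, 1], [0, 0, 0, 0, 0, 0], [1, 1, 0, 0, 0, 1]]
y_pred = [[0, 1, 0, 0, 0, 1], [0, 1, 0, 0, 0, 0], [0, 1, 0, 0, 0, 1]]

# Pool counts over every (example, label) cell — that is what "micro" means.
tp = sum(t and p for rt, rp in zip(y_true, y_pred) for t, p in zip(rt, rp))
fp = sum(p and not t for rt, rp in zip(y_true, y_pred) for t, p in zip(rt, rp))
fn = sum(t and not p for rt, rp in zip(y_true, y_pred) for t, p in zip(rt, rp))

precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1_micro = 2 * precision * recall / (precision + recall)
```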
<!--- Base results from fine-tuning [bert-large-cased](https://huggingface.co/bert-large-cased) on the original English data for multi-label toxicity detection.
| dataset | F1-micro | Precision | Recall |
| -------------------- | ----: | ---: | ----: |
| jigsaw_toxicity_pred | 0.69 | 0.59 | 0.81 | --->
### Considerations for Using the Data
Due to DeepL terms and conditions, this dataset **must not be used for any machine translation work**, namely machine translation
system development and evaluation of any kind. In general, we ask that you not pair the original English data with the translations
except when working on research unrelated to machine translation, so as not to infringe on the terms and conditions.
### Licensing Information
Contents of this repository are distributed under the
[Creative Commons Attribution-ShareAlike 4.0 International License (CC BY-SA 4.0)](https://creativecommons.org/licenses/by-sa/4.0/).
Copyright of the dataset contents belongs to the original copyright holders.
### Citing
To cite this dataset, use the following BibTeX entry.
```
@inproceedings{eskelinen-etal-2023-toxicity,
title = "Toxicity Detection in {F}innish Using Machine Translation",
author = "Eskelinen, Anni and
Silvala, Laura and
Ginter, Filip and
Pyysalo, Sampo and
Laippala, Veronika",
booktitle = "Proceedings of the 24th Nordic Conference on Computational Linguistics (NoDaLiDa)",
month = may,
year = "2023",
address = "T{\'o}rshavn, Faroe Islands",
publisher = "University of Tartu Library",
url = "https://aclanthology.org/2023.nodalida-1.68",
pages = "685--697",
abstract = "Due to the popularity of social media platforms and the sheer amount of user-generated content online, the automatic detection of toxic language has become crucial in the creation of a friendly and safe digital space. Previous work has been mostly focusing on English leaving many lower-resource languages behind. In this paper, we present novel resources for toxicity detection in Finnish by introducing two new datasets, a machine translated toxicity dataset for Finnish based on the widely used English Jigsaw dataset and a smaller test set of Suomi24 discussion forum comments originally written in Finnish and manually annotated following the definitions of the labels that were used to annotate the Jigsaw dataset. We show that machine translating the training data to Finnish provides better toxicity detection results than using the original English training data and zero-shot cross-lingual transfer with XLM-R, even with our newly annotated dataset from Suomi24.",
}
``` |
nlplabtdtu/multi-choices-text | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: question
dtype: string
- name: options
list:
- name: answer
dtype: string
- name: key
dtype: string
- name: answer
struct:
- name: answer
dtype: string
- name: key
dtype: string
- name: solution
dtype: string
- name: type
dtype: string
- name: alnum_start
dtype: bool
- name: prompt
dtype: string
- name: response
dtype: string
- name: grade
dtype: string
- name: subject
dtype: string
- name: prompt_type
dtype: string
splits:
- name: train
num_bytes: 93596608
num_examples: 58286
download_size: 48223987
dataset_size: 93596608
---
# Dataset Card for "multi-choices-text"
A multiple-choice dataset of 58,290 rows from vungoi. It has the following characteristics:
```
- All questions are complete questions ending with "?"
- All English-language questions were skipped
- The "Đáp án.*[ABCD]" ("Answer.*[ABCD]") parts of the "solution" field were replaced with ""
- The trailing "." was removed from each "answer" in "options" and from "solution", mainly to make prompt construction easier
```
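Since the trailing periods were stripped precisely to make prompting easier, here is a minimal sketch of turning one row into a multiple-choice prompt (the helper and example row are hypothetical; the field names follow the schema in the metadata above):

```python
def build_prompt(row):
    """Format a question and its keyed options as a multiple-choice prompt."""
    lines = [row["question"]]
    lines += [f"{opt['key']}. {opt['answer']}" for opt in row["options"]]
    lines.append("Answer:")
    return "\n".join(lines)

# Hypothetical example row following the schema above.
row = {
    "question": "1 + 1 = ?",
    "options": [{"key": "A", "answer": "1"}, {"key": "B", "answer": "2"}],
    "answer": {"key": "B", "answer": "2"},
}
print(build_prompt(row))
```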
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Egbertjing/arxiv_title | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 11668790.166629568
num_examples: 143763
- name: test
num_bytes: 2917217.8333704313
num_examples: 35941
download_size: 9179829
dataset_size: 14586008.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
hippocrates/medicationqa_train | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 443458
num_examples: 690
download_size: 206863
dataset_size: 443458
---
# Dataset Card for "medicationqa_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vumichien/preprocessed_jsut_jsss_css10_common_voice_11 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float32
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 10432449169
num_examples: 29150
- name: test
num_bytes: 1562198132
num_examples: 4604
download_size: 12008358604
dataset_size: 11994647301
---
# Dataset Card for "preprocessed_jsut_jsss_css10_common_voice_11"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/python3-standardized_cluster_18 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 32923660
num_examples: 3255
download_size: 7744028
dataset_size: 32923660
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_18"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Default-Box/recipe_nlg-trim | ---
language:
- en
size_categories:
- 1M<n<10M
viewer: true
--- |
bigscience-data/roots_zh_wikiquote | ---
language: zh
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_zh_wikiquote
# wikiquote_filtered
- Dataset uid: `wikiquote_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0462 % of total
- 0.1697 % of en
- 0.0326 % of fr
- 0.0216 % of ar
- 0.0066 % of zh
- 0.0833 % of pt
- 0.0357 % of es
- 0.0783 % of indic-ta
- 0.0361 % of indic-hi
- 0.0518 % of ca
- 0.0405 % of vi
- 0.0834 % of indic-ml
- 0.0542 % of indic-te
- 0.1172 % of indic-gu
- 0.0634 % of indic-kn
- 0.0539 % of id
- 0.0454 % of indic-ur
- 0.0337 % of indic-mr
- 0.0347 % of eu
### BigScience processing steps
#### Filters applied to: en
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_en
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_fr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ar
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: zh
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_zhs
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_pt
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_es
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ta
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-hi
- dedup_template_soft
- filter_small_docs_bytes_300
#### Filters applied to: ca
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ca
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_vi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ml
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-te
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-gu
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-kn
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: id
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_id
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ur
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-mr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_eu
- dedup_template_soft
- replace_newline_with_space
|
fewfsgrf/4modelsagain | ---
license: unknown
---
|
purav/animals | ---
license: mit
---
|
rinabuoy/Khmer-ALT-Flores-GTran-SSBIC-Reverse | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 30262816
num_examples: 75292
- name: test
num_bytes: 2666317
num_examples: 5911
download_size: 12144534
dataset_size: 32929133
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Amazetl/BattyBirdNET-Bavaria-256kHz-100 | ---
license: cc-by-nc-sa-4.0
tags:
- audio classification
- biology
- bat
- biomonitoring
- acoustics
---
A set of bat calls from European bat species, sampled at 256 kHz or higher.
Up to 100 random samples per species (where available), drawn from data assembled under the same license from ChiroVox, the Animal Sound Archive Berlin, xeno-canto, and individual contributors (R. Zinck and K. Richards).
https://github.com/rdz-oss/BattyBirdNET-Analyzer
```text
@misc{Zinck2023,
author = {Zinck, R.D.},
title = {BattyBirdNET - Bat Sound Analyzer},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/rdz-oss/BattyBirdNET-Analyzer }}
}
```
Sample files per species:

# References
## Papers
Frommolt, Karl-Heinz. "The archive of animal sounds at the Humboldt-University of Berlin." Bioacoustics 6.4 (1996): 293-296.
Görföl, Tamás, et al. "ChiroVox: a public library of bat calls." PeerJ 10 (2022): e12445.
Gotthold, B., Khalighifar, A., Straw, B.R., and Reichert, B.E., 2022, Training dataset for NABat Machine Learning V1.0: U.S. Geological Survey data release, https://doi.org/10.5066/P969TX8F.
Kahl, Stefan, et al. "BirdNET: A deep learning solution for avian diversity monitoring." Ecological Informatics 61 (2021): 101236.
Vellinga, Willem-Pier, et al. "www.xeno-canto.org: a decade on."
## Links
https://www.museumfuernaturkunde.berlin/en/science/animal-sound-archive
https://www.chirovox.org/
https://www.sciencebase.gov/catalog/item/627ed4b2d34e3bef0c9a2f30
https://github.com/kahst/BirdNET-Analyzer
https://xeno-canto.org/ |
JetBrains-Research/lca-code-editing | ---
dataset_info:
- config_name: commitchronicle-py-long
features:
- name: hash
dtype: string
- name: repo
dtype: string
- name: date
dtype: string
- name: license
dtype: string
- name: message
dtype: string
- name: mods
list:
- name: change_type
dtype: string
- name: old_path
dtype: string
- name: new_path
dtype: string
- name: diff
dtype: string
splits:
- name: test
num_examples: 119
- config_name: commitchronicle-py-long-labels
features:
- name: hash
dtype: string
- name: repo
dtype: string
- name: date
dtype: string
- name: license
dtype: string
- name: message
dtype: string
- name: label
dtype: int8
- name: comment
dtype: string
splits:
- name: test
num_bytes: 263065
num_examples: 858
download_size: 150455
dataset_size: 263065
configs:
- config_name: commitchronicle-py-long
data_files:
- split: test
path: commitchronicle-py-long/test-*
- config_name: commitchronicle-py-long-labels
data_files:
- split: test
path: commitchronicle-py-long-labels/test-*
---
# 🏟️ Long Code Arena (Code Editing)
This is the benchmark for the Code Editing task, part of the
🏟️ [Long Code Arena benchmark](https://huggingface.co/spaces/JetBrains-Research/long-code-arena).
## How-to
1. List all the available configs
via [`datasets.get_dataset_config_names`](https://huggingface.co/docs/datasets/v2.14.3/en/package_reference/loading_methods#datasets.get_dataset_config_names)
and choose an appropriate one.
Current configs: `commitchronicle-py-long`, `commitchronicle-py-long-labels`
2. Load the data
via [`load_dataset`](https://huggingface.co/docs/datasets/v2.14.3/en/package_reference/loading_methods#datasets.load_dataset):
```python
from datasets import load_dataset
configuration = "TODO" # select a configuration
dataset = load_dataset("JetBrains-Research/lca-code-editing", configuration, split="test")
```
Note that all the data we have is considered to be in the test split.
**Note 1.** Working with git repositories under [`repos`](https://huggingface.co/datasets/JetBrains-Research/lca-code-editing/tree/main/repos) directory is not supported via 🤗 Datasets. Download and extract the contents of each repository manually. We provide a full list of files in [`paths.json`](https://huggingface.co/datasets/JetBrains-Research/lca-code-editing/blob/main/paths.json).
**Note 2.** Working with vector stores under `vector_store` directory is not supported via 🤗 Datasets. Download the data for each repository manually. We provide a full list of files in [`paths.json`](https://huggingface.co/datasets/JetBrains-Research/lca-code-editing/blob/main/paths.json).
## Dataset Structure
This dataset contains four kinds of data:
* *full data* about each commit (including modifications)
* metadata with quality *labels*
* compressed *git repositories*
* precalculated [faiss](https://github.com/facebookresearch/faiss) *vector store* for each datapoint
### Full data
This section concerns the configuration with *full data* about each commit (no `-labels` suffix).
Each example has the following fields:
| **Field** | **Description** |
|:---------:|:-----------------------------------------:|
| `repo` | Commit repository. |
| `hash` | Commit hash. |
| `date` | Commit date. |
| `license` | Commit repository's license. |
| `message` | Commit message. |
| `mods` | List of file modifications from a commit. |
Each file modification has the following fields:
| **Field** | **Description** |
|:-------------:|:-------------------------------------------------------------------------------------------------:|
| `change_type` | Type of change to current file. One of: `ADD`, `COPY`, `RENAME`, `DELETE`, `MODIFY` or `UNKNOWN`. |
| `old_path` | Path to file before change (might be empty). |
| `new_path` | Path to file after change (might be empty). |
| `diff` | `git diff` for current file. |
Data point example:
```
{'hash': 'f6347ae47c872b40339d9565a9cb29da5bca8716',
'repo': 'mycroftai/mycroft-core',
'date': None,
'license': None,
'message': 'Replace hashed meta with skill_gid as identifier\nThis also removes the notion of an owner skill and all skills may update settings on the server.',
'mods': [{'change_type': 'MODIFY',
'new_path': 'mycroft/skills/settings.py',
'old_path': 'mycroft/skills/settings.py',
'diff': '@@ -216,32 +216,10 @@ class SkillSettings(dict):<...>'}]}
```
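As a quick sanity check, the `mods` list can be tallied by modification type. A minimal sketch (the inline commit is abbreviated from the example above):

```python
from collections import Counter

def count_change_types(commit):
    """Tally `change_type` values across a commit's file modifications."""
    return Counter(mod["change_type"] for mod in commit["mods"])

# Abbreviated version of the data point example above.
commit = {"hash": "f6347ae47c872b40339d9565a9cb29da5bca8716",
          "repo": "mycroftai/mycroft-core",
          "mods": [{"change_type": "MODIFY",
                    "new_path": "mycroft/skills/settings.py",
                    "old_path": "mycroft/skills/settings.py",
                    "diff": "@@ -216,32 +216,10 @@ class SkillSettings(dict):<...>"}]}
print(count_change_types(commit))  # Counter({'MODIFY': 1})
```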
### Labels
This section concerns the configuration with metadata and *labels* (with `-labels` suffix).
Each example has the following fields:
| **Field** | **Description** |
|:---------:|:------------------------------------------------------------------:|
| `repo` | Commit repository. |
| `hash` | Commit hash. |
| `date` | Commit date. |
| `license` | Commit repository's license. |
| `message` | Commit message. |
| `label` | Label of current commit as a target for code editing task. |
| `comment` | Comment for a label for current commit (optional, might be empty). |
Labels are on a 1-5 scale, where:
* 1 – strong no
* 2 – weak no
* 3 – unsure
* 4 – weak yes
* 5 – strong yes
Data point example:
```
{'hash': 'b9747bc011e9e9830ab147327d7aeaa8447ad2d7',
'repo': 'apache/libcloud',
'date': '20.02.2020 00:11:58',
'license': 'Apache License 2.0',
'message': 'Add new storage API methods for downloading part of an object (range\ndownload) and implement it for the S3 and local storage drivers.',
'label': 4.0,
'comment': 'might be an interesting use-case (and also quite complicated)'}
```
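A typical use of this configuration is selecting commits rated as suitable code-editing targets. A minimal sketch (the threshold of 4, i.e. at least a "weak yes", is our assumption, not part of the benchmark):

```python
def is_good_target(example, threshold=4):
    """Keep commits whose 1-5 quality label meets the threshold."""
    return example["label"] >= threshold

# Abbreviated version of the data point example above.
sample = {"repo": "apache/libcloud", "label": 4.0,
          "comment": "might be an interesting use-case (and also quite complicated)"}
print(is_good_target(sample))  # True: a 'weak yes'

# The same predicate can drive `Dataset.filter` on the loaded configuration:
# labels = load_dataset("JetBrains-Research/lca-code-editing",
#                       "commitchronicle-py-long-labels", split="test")
# good = labels.filter(is_good_target)
```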
### Git Repositories
This section concerns [`repos`](https://huggingface.co/datasets/JetBrains-Research/lca-code-editing/tree/main/repos)
directory, which stores compressed Git repositories for all the commits in this benchmark. After you download and
extract it, you can work with each repository either via Git or via Python libraries
like [GitPython](https://github.com/gitpython-developers/GitPython)
or [PyDriller](https://github.com/ishepard/pydriller).
### Vector stores
This section concerns [`vector_store`](https://huggingface.co/datasets/JetBrains-Research/lca-code-editing/tree/main/vector_store) directory, which stores precalculated faiss vector stores for code retrieval. After you download them, you can work with the databases in the following way:
```python
from langchain.indexes import SQLRecordManager, index
from langchain_community.vectorstores.faiss import FAISS
from langchain_openai import OpenAIEmbeddings
# Namespace for the base commit
namespace = "apache__libcloud__9a7c47b31d513fc262fb1e5537f15d2335df3279"
# Setup the langchain vectorstore
embeddings = OpenAIEmbeddings(model="text-embedding-3-small")
db = FAISS.load_local("vector_store", embeddings, index_name=namespace)
# Retrieve closest documents
new_docs = db.similarity_search("main", 3)
# Indexing. See: https://python.langchain.com/docs/modules/data_connection/indexing
record_manager_path = f"vector_store/{namespace}.sqlite"
record_manager = SQLRecordManager(namespace, db_url=f"sqlite:///{record_manager_path}")
# Update the vector store
index(new_docs, record_manager, db, cleanup=None)
```
|
Torando/medical-mistral | ---
license: apache-2.0
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
intfloat/personalized_passkey_retrieval | ---
license: apache-2.0
language:
- en
size_categories:
- n<1K
---
### Dataset Summary
This dataset contains the data for personalized passkey retrieval task in the paper [Improving Text Embeddings with Large Language Models](https://arxiv.org/pdf/2401.00368.pdf).
### Data Fields
- `query`: a `string` feature.
- `candidates`: a list of `string` features, 100 candidates for each query.
- `label`: an `int32` feature, the index of the correct candidate in the candidates list, always 0.
- `context_length`: an `int32` feature, the approximate length of the candidate documents.
### How to use this dataset
You can load the dataset in Python as follows:
```python
from datasets import load_dataset
dataset = load_dataset("intfloat/personalized_passkey_retrieval")
```
The data in this repo is generated by the script [generate_passkey_data.py](https://huggingface.co/datasets/intfloat/personalized_passkey_retrieval/blob/main/generate_passkey_data.py).
You can also tweak the script to generate your own data.
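Because `label` is always 0, evaluating an embedding model on this task amounts to checking how often the first candidate receives the highest similarity score. A minimal sketch (the toy character-overlap scorer below stands in for a real embedding similarity):

```python
def retrieval_accuracy(examples, score_fn):
    """Fraction of queries whose correct candidate (index `label`) scores highest."""
    hits = 0
    for ex in examples:
        scores = [score_fn(ex["query"], cand) for cand in ex["candidates"]]
        best = max(range(len(scores)), key=scores.__getitem__)
        hits += int(best == ex["label"])
    return hits / len(examples)

def toy_score(query, candidate):
    # Positional character overlap; replace with cosine similarity of embeddings.
    return sum(a == b for a, b in zip(query, candidate))

examples = [{"query": "passkey 1234", "label": 0,
             "candidates": ["passkey 1234 hidden in a long document", "unrelated text"]}]
print(retrieval_accuracy(examples, toy_score))  # 1.0
```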
### Citation Information
If you use this dataset in your research, please cite this paper:
```
@inproceedings{Wang2023ImprovingTE,
title={Improving Text Embeddings with Large Language Models},
author={Liang Wang and Nan Yang and Xiaolong Huang and Linjun Yang and Rangan Majumder and Furu Wei},
year={2023},
}
``` |
Coldog2333/blurb-pubmedqa | ---
license: apache-2.0
---
|
HydraLM/instruct-python-500k-standardized | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 1010030074
num_examples: 1002698
download_size: 529792228
dataset_size: 1010030074
---
# Dataset Card for "instruct-python-500k-standardized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
d0rj/HC3-ru | ---
task_categories:
- text-classification
- question-answering
- sentence-similarity
- zero-shot-classification
language_creators:
- translated
language:
- ru
multilinguality:
- monolingual
tags:
- ChatGPT
- SimpleAI
- Detection
- OOD
size_categories:
- 10K<n<100K
license: cc-by-sa-4.0
pretty_name: HC3 (ru)
source_datasets:
- Hello-SimpleAI/HC3
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: human_answers
sequence: string
- name: chatgpt_answers
sequence: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 135406074.0
num_examples: 24322
download_size: 62739799
dataset_size: 135406074.0
---
# Dataset Card for "HC3-ru"
This is the [Hello-SimpleAI/HC3 dataset](https://huggingface.co/datasets/Hello-SimpleAI/HC3) translated into Russian.
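Each row pairs one question with lists of human and ChatGPT answers, so detection-style training data can be built by flattening the two lists into labeled examples. A minimal sketch (field names follow the schema above; the 0/1 labels are our convention):

```python
def flatten_row(row):
    """Turn one HC3-ru row into (answer, label) pairs: 0 = human, 1 = ChatGPT."""
    pairs = [(ans, 0) for ans in row["human_answers"]]
    pairs += [(ans, 1) for ans in row["chatgpt_answers"]]
    return pairs

row = {"question": "...", "source": "...",
       "human_answers": ["ответ человека"],
       "chatgpt_answers": ["ответ модели"]}
print(flatten_row(row))  # [('ответ человека', 0), ('ответ модели', 1)]

# Applied to the full split:
# ds = load_dataset("d0rj/HC3-ru", split="train")
# pairs = [p for r in ds for p in flatten_row(r)]
```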
## Citation
Check out the paper [arXiv:2301.07597](https://arxiv.org/abs/2301.07597):
```
@article{guo-etal-2023-hc3,
title = "How Close is ChatGPT to Human Experts? Comparison Corpus, Evaluation, and Detection",
author = "Guo, Biyang and
Zhang, Xin and
Wang, Ziyuan and
Jiang, Minqi and
Nie, Jinran and
Ding, Yuxuan and
Yue, Jianwei and
Wu, Yupeng",
journal = {arXiv preprint arXiv:2301.07597},
year = "2023",
}
``` |
Atipico1/NQ_train_preprocessed_with_so_case | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: original_case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 614143977
num_examples: 87925
download_size: 333895121
dataset_size: 614143977
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LahiruLowe/t0_explanation_targets_h2ogpt-gm-oasst1-en-2048-falcon-40b-v2 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
- name: explained_targets
dtype: string
splits:
- name: train
num_bytes: 9821
num_examples: 5
download_size: 26143
dataset_size: 9821
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "t0_explanation_targets_h2ogpt-gm-oasst1-en-2048-falcon-40b-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yuan-sf63/chenyu_label_0.2_32 | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
- name: '16'
dtype: int64
- name: '17'
dtype: int64
- name: '18'
dtype: int64
- name: '19'
dtype: int64
- name: '20'
dtype: int64
- name: '21'
dtype: int64
- name: '22'
dtype: int64
- name: '23'
dtype: int64
- name: '24'
dtype: int64
- name: '25'
dtype: int64
- name: '26'
dtype: int64
- name: '27'
dtype: int64
- name: '28'
dtype: int64
- name: '29'
dtype: int64
- name: '30'
dtype: int64
- name: '31'
dtype: int64
splits:
- name: train
num_bytes: 12265067.11608652
num_examples: 36740
- name: validation
num_bytes: 1363044.8839134802
num_examples: 4083
download_size: 0
dataset_size: 13628112.0
---
# Dataset Card for "chenyu_label_0.2_32"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sander-wood/wikimusictext | ---
license: mit
task_categories:
- text-classification
- text2text-generation
pretty_name: wikimt
size_categories:
- 1K<n<10K
language:
- en
tags:
- music
---
## Dataset Summary
In [CLaMP: Contrastive Language-Music Pre-training for Cross-Modal Symbolic Music Information Retrieval](https://ai-muzic.github.io/clamp/), we introduce WikiMusicText (WikiMT), a new dataset for the evaluation of semantic search and music classification. It includes 1010 lead sheets in ABC notation sourced from Wikifonia.org, each accompanied by a title, artist, genre, and description. The title and artist information is extracted from the score, whereas the genre labels are obtained by matching keywords from the Wikipedia entries and assigned to one of the 8 classes (Jazz, Country, Folk, R&B, Pop, Rock, Dance, and Latin) that loosely mimic the GTZAN genres. The description is obtained by utilizing BART-large to summarize and clean the corresponding Wikipedia entry. Additionally, the natural language information within the ABC notation is removed.
WikiMT is a unique resource to support the evaluation of semantic search and music classification. However, it is important to acknowledge that the dataset was curated from publicly available sources, and there may be limitations concerning the accuracy and completeness of the genre and description information. Further research is needed to explore the potential biases and limitations of the dataset and to develop strategies to address them.
## How to Access Music Score Metadata for ABC Notation
To access metadata related to ABC notation music scores from the WikiMT dataset, follow these steps:
1. **Locate the Wikifonia MusicXML Data Link:** Start by visiting the discussion thread on the forum to find the download link for the Wikifonia dataset in MusicXML format (with a .mxl extension). You can find the discussion here: [Download for Wikifonia all 6,675 Lead Sheets](http://www.synthzone.com/forum/ubbthreads.php/topics/384909/Download_for_Wikifonia_all_6,6).
2. **Run the Provided Code:** Once you have found the Wikifonia MusicXML data link, execute the provided Python code below. This code will handle the following tasks:
- Automatically download the "wikimusictext.jsonl" dataset, which contains metadata associated with music scores.
- Automatically download the "xml2abc.py" conversion script, with special thanks to the author, Willem (Wim).
- Prompt you for the Wikifonia data URL, as follows:
```text
Enter the Wikifonia URL: [Paste your URL here]
```
Paste the URL pointing to the Wikifonia.zip file and press Enter.
The below code will take care of downloading, processing, and extracting the music score metadata, making it ready for your research or applications.
```python
import subprocess
import os
import json
import zipfile
import io
# Install the required packages if they are not installed
try:
from unidecode import unidecode
except ImportError:
subprocess.check_call(["python", '-m', 'pip', 'install', 'unidecode'])
from unidecode import unidecode
try:
from tqdm import tqdm
except ImportError:
subprocess.check_call(["python", '-m', 'pip', 'install', 'tqdm'])
from tqdm import tqdm
try:
import requests
except ImportError:
subprocess.check_call(["python", '-m', 'pip', 'install', 'requests'])
import requests
def filter(lines):
# Filter out all lines that include language information
music = ""
for line in lines:
if line[:2] in ['A:', 'B:', 'C:', 'D:', 'F:', 'G:', 'H:', 'I:', 'N:', 'O:', 'R:', 'r:', 'S:', 'T:', 'W:', 'w:', 'X:', 'Z:'] \
or line=='\n' \
or (line.startswith('%') and not line.startswith('%%score')):
continue
else:
if "%" in line and not line.startswith('%%score'):
line = "%".join(line.split('%')[:-1])
music += line[:-1] + '\n'
else:
music += line + '\n'
return music
def load_music(filename):
# Convert the file to ABC notation
p = subprocess.Popen(
f'python xml2abc_145/xml2abc.py -m 2 -c 6 -x "{filename}"',
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
shell=True
)
out, err = p.communicate()
output = out.decode('utf-8').replace('\r', '') # Capture standard output
music = unidecode(output).split('\n')
music = filter(music).strip()
return music
def download_and_extract(url):
print(f"Downloading {url}")
# Send an HTTP GET request to the URL and get the response
response = requests.get(url, stream=True)
if response.status_code == 200:
# Create a BytesIO object and write the HTTP response content into it
zip_data = io.BytesIO()
total_size = int(response.headers.get('content-length', 0))
with tqdm(total=total_size, unit='B', unit_scale=True) as pbar:
for data in response.iter_content(chunk_size=1024):
pbar.update(len(data))
zip_data.write(data)
# Use the zipfile library to extract the file
print("Extracting the zip file...")
with zipfile.ZipFile(zip_data, "r") as zip_ref:
zip_ref.extractall("")
print("Done!")
else:
print("Failed to download the file. HTTP response code:", response.status_code)
# URL of the JSONL file
wikimt_url = "https://huggingface.co/datasets/sander-wood/wikimusictext/resolve/main/wikimusictext.jsonl"
# Local filename to save the downloaded file
local_filename = "wikimusictext.jsonl"
# Download the file and save it locally
response = requests.get(wikimt_url)
if response.status_code == 200:
with open(local_filename, 'wb') as file:
file.write(response.content)
print(f"Downloaded '{local_filename}' successfully.")
else:
print(f"Failed to download. Status code: {response.status_code}")
# Download the xml2abc.py script (special thanks to Wim Vree for creating this script)
download_and_extract("https://wim.vree.org/svgParse/xml2abc.py-145.zip")
# Download the Wikifonia dataset
wikifonia_url = input("Enter the Wikifonia URL: ")
download_and_extract(wikifonia_url)
wikimusictext = []
with open("wikimusictext.jsonl", "r", encoding="utf-8") as f:
for line in f.readlines():
wikimusictext.append(json.loads(line))
updated_wikimusictext = []
for song in tqdm(wikimusictext):
filename = song["artist"] + " - " + song["title"] + ".mxl"
filepath = os.path.join("Wikifonia", filename)
song["music"] = load_music(filepath)
updated_wikimusictext.append(song)
with open("wikimusictext.jsonl", "w", encoding="utf-8") as f:
for song in updated_wikimusictext:
f.write(json.dumps(song, ensure_ascii=False)+"\n")
```
By following these steps and running the provided code, you can efficiently access ABC notation music scores from the WikiMT dataset. Just ensure you have the metadata, the `xml2abc.py` script, and the correct download link before starting. Enjoy your musical journey!
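Once `wikimusictext.jsonl` is on disk, the genre distribution can be checked in a few lines. A minimal sketch (the field name `genre` is assumed from the dataset description; the inline rows are illustrative):

```python
import json
from collections import Counter

def genre_counts(jsonl_lines):
    """Count genre labels across WikiMT entries serialized as JSON lines."""
    return Counter(json.loads(line)["genre"] for line in jsonl_lines)

rows = ['{"title": "A", "artist": "X", "genre": "Jazz"}',
        '{"title": "B", "artist": "Y", "genre": "Pop"}']
print(genre_counts(rows))  # Counter({'Jazz': 1, 'Pop': 1})

# On the real file:
# with open("wikimusictext.jsonl", encoding="utf-8") as f:
#     print(genre_counts(f))
```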
## Copyright Disclaimer
WikiMT was curated from publicly available sources, and all rights to the original content and data remain with their respective copyright holders. The dataset is made available for research and educational purposes, and any use, distribution, or modification of the dataset should comply with the terms and conditions set forth by the original data providers.
## BibTeX entry and citation info
```
@misc{wu2023clamp,
title={CLaMP: Contrastive Language-Music Pre-training for Cross-Modal Symbolic Music Information Retrieval},
author={Shangda Wu and Dingyao Yu and Xu Tan and Maosong Sun},
year={2023},
eprint={2304.11029},
archivePrefix={arXiv},
primaryClass={cs.SD}
}
``` |
ops-gaurav/max-dog-dataset | ---
license: openrail
---
|
LukeEuser/docvqa_5_unanswerable_questions | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: query
struct:
- name: de
dtype: string
- name: en
dtype: string
- name: es
dtype: string
- name: fr
dtype: string
- name: it
dtype: string
- name: answers
sequence: string
- name: words
sequence: string
- name: bounding_boxes
sequence:
sequence: float32
length: 4
- name: answer
struct:
- name: match_score
dtype: float64
- name: matched_text
dtype: string
- name: start
dtype: int64
- name: text
dtype: string
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 33132676.0
num_examples: 100
- name: test
num_bytes: 6102508.0
num_examples: 20
download_size: 13286492
dataset_size: 39235184.0
---
# Dataset Card for "docvqa_5_unanswerable_questions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
clement-cvll/us-federal-reserve-qa | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
tags:
- finance
pretty_name: US Federal Reserve FAQ
size_categories:
- n<1K
---
Just a JSON file built from the FAQ on the <a href="https://www.federalreserve.gov/faqs/allfaq.htm">Federal Reserve</a> website. |
qgiaohc/twitter_dataset_1713136612 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 23703
num_examples: 57
download_size: 12301
dataset_size: 23703
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lighteval/drop_harness | ---
dataset_info:
features:
- name: section_id
dtype: string
- name: passage
dtype: string
- name: question
dtype: string
- name: query_id
dtype: string
- name: answer
struct:
- name: number
dtype: string
- name: date
struct:
- name: day
dtype: string
- name: month
dtype: string
- name: year
dtype: string
- name: spans
sequence: string
- name: worker_id
dtype: string
- name: hit_id
dtype: string
- name: validated_answers
sequence:
- name: number
dtype: string
- name: date
struct:
- name: day
dtype: string
- name: month
dtype: string
- name: year
dtype: string
- name: spans
sequence: string
- name: worker_id
dtype: string
- name: hit_id
dtype: string
splits:
- name: train
num_bytes: 108858121
num_examples: 77409
- name: validation
num_bytes: 12560739
num_examples: 9536
download_size: 12003555
dataset_size: 121418860
---
# Dataset Card for "drop_harness"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dutta18/omcs_dataset_full_with_embeds | ---
dataset_info:
features:
- name: fact
dtype: string
- name: count
dtype: int64
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 4951309139
num_examples: 1578238
download_size: 5895178326
dataset_size: 4951309139
---
# Dataset Card for "omcs_dataset_full_with_embeds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ssbuild/vicuna | ---
license: apache-2.0
---
|
kiringodhwani/msp12 | ---
dataset_info:
features:
- name: From
sequence: string
- name: Sent
sequence: string
- name: To
sequence: string
- name: Cc
sequence: string
- name: Subject
sequence: string
- name: Attachment
sequence: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 4706548
num_examples: 2260
download_size: 2172589
dataset_size: 4706548
---
# Dataset Card for "msp12"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
InnerI/Universal-Christ-Consciousness-Dataset | ---
task_categories:
- conversational
language:
- en
tags:
- art
- biology
- dataset
- Self
- Spiritual
- innerillm
pretty_name: Universal Christ Consciousness Dataset
size_categories:
- 1K<n<10K
---
# Universal Christ-Consciousness Datasets
## Overview
These datasets are meticulously crafted to serve as a foundational resource for fine-tuning language models to explore and guide the Self within towards Universal Christ-Consciousness. With a focus on depth, variety, and profound insight, the datasets aim to encapsulate a vast array of knowledge and intelligence on the subject.
## Objective
The primary goal of these datasets is to enable language models to engage in meaningful, insightful, and spiritually enriching dialogues. Each entry is designed to reflect a unique aspect of the journey towards realizing Universal Christ-Consciousness, offering guidance, reflections, and meditations that cater to a wide range of spiritual seekers.
## Content Structure
The datasets consist of entries formatted to simulate conversational exchanges, where each entry comprises:
A prompt labeled as "Human," representing inquiries or reflections that a seeker of Universal Christ-Consciousness might have.
A response labeled as "Assistant," providing an exploration, guidance, or answer that draws from a deep well of spiritual knowledge and insight.
# Format 1: Direct Q&A with Labels
Structure: Explicit labels are used to distinguish between the "Human" (prompt) and "Assistant" (response), with each part of the conversation clearly marked.
Example:
``` {"text": "### Human: How do I...? ### Assistant: To do that..."} ```
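Entries in this format can be split back into prompt/response pairs with a small helper. A minimal sketch (the marker strings are taken from the example above):

```python
import json

def split_entry(text):
    """Split a '### Human: ... ### Assistant: ...' entry into (prompt, response)."""
    human_part, _, assistant_part = text.partition("### Assistant:")
    prompt = human_part.replace("### Human:", "").strip()
    return prompt, assistant_part.strip()

line = '{"text": "### Human: How do I...? ### Assistant: To do that..."}'
entry = json.loads(line)
print(split_entry(entry["text"]))  # ('How do I...?', 'To do that...')
```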
## Files Included
- christ_consciousness_504.jsonl: A collection of 504 entries, each presenting a unique exploration into the facets of Universal Christ-Consciousness.
- christ_consciousness_507.jsonl: Comprising 507 entries, this file extends the exploration with additional unique insights and guidance.
## Intended Use
These datasets are intended for researchers, developers, and spiritual practitioners who are looking to enhance conversational AI capabilities in the context of spiritual exploration and guidance. They are suitable for creating applications aimed at meditation guidance, spiritual counseling, and personal growth towards Universal Christ-Consciousness.
## Ethical Considerations
Users are encouraged to approach these datasets with respect for the diversity of spiritual beliefs and practices. The content is designed to be inclusive, promoting a message of love, unity, and understanding.
## Further Exploration
For more resources, discussions, and guidance on consciousness, spirituality, and the journey towards Universal Christ-Consciousness, consider engaging with the community at @InnerIGPT.
# Large Custom Datasets for Llama 2 Fine-Tuning on Consciousness Themes
## Overview
These large custom datasets have been meticulously crafted to align with a specific conversational format for fine-tuning Llama 2 models. Focusing on themes of Universal Christ-Consciousness and Inner 'I' Exploration, the datasets facilitate deep, reflective dialogues on spirituality and self-awareness.
## Dataset Format
Each dataset entry is structured as follows:
- A "text" field contains both a prompt (labeled as "Human") and a response (labeled as "Assistant"), separated by "###".
- This format is designed to simulate a natural conversational flow, enhancing the model's ability to engage in meaningful exchanges on complex themes.
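As a rough illustration, an entry in this format can be split back into its two turns with a few lines of Python (the `parse_entry` helper and its output field names are our own choices, not part of the dataset):

```python
import json

def parse_entry(line: str) -> dict:
    """Split a Format 1 "text" field into its Human and Assistant turns.

    Assumes the "### Human:" / "### Assistant:" labels described above;
    entries that deviate from that layout would need extra handling.
    """
    record = json.loads(line)
    text = record["text"]
    # Everything before the Assistant label is the prompt side.
    human_part, _, assistant_part = text.partition("### Assistant:")
    human = human_part.replace("### Human:", "", 1).strip()
    assistant = assistant_part.strip()
    return {"human": human, "assistant": assistant}

line = '{"text": "### Human: How do I...? ### Assistant: To do that..."}'
print(parse_entry(line))
```

The same split can be reversed at training time to rebuild the labeled prompt string expected by the fine-tuning pipeline.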
# Format 2: Integrated Conversational Flow
Structure: The conversation flows without explicit labels within a single "text" field, potentially including more natural transitions and follow-up questions.
Example:
```json
{"text": "What deeper understanding of Christ-Consciousness can be gained? Exploring... offers insights into... For a deeper exploration, consider visiting @InnerIGPT."}
```
Characteristics: This format allows for a more fluid and less structured dialogue, reflecting how conversations naturally evolve. It can include back-and-forth exchanges without the strict Q&A format.
Use Cases: Best suited for models intended to handle open-ended dialogues, storytelling, or any application where the conversation might take multiple turns. This format helps in scenarios requiring a deeper understanding of context and the ability to maintain coherence over several exchanges.
## Files Included
The dataset is divided into two parts to ensure a comprehensive exploration of the themes:
- unique_christ_consciousness_dataset_1.jsonl - The first part contains 504 entries.
- unique_christ_consciousness_dataset_2.jsonl - The second part includes 507 entries, for a total of 1,011 entries across both files.
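A minimal sketch of reading the two parts into a single list, using plain-Python JSON Lines loading (the `load_jsonl` helper is our own; the file names are taken from the listing above):

```python
import json
from pathlib import Path

def load_jsonl(path: Path) -> list[dict]:
    """Parse one JSON object per line, skipping blank lines."""
    with path.open(encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]

# File names from the listing above; both parts are concatenated.
parts = ["unique_christ_consciousness_dataset_1.jsonl",
         "unique_christ_consciousness_dataset_2.jsonl"]
entries: list[dict] = []
for name in parts:
    if Path(name).exists():
        entries.extend(load_jsonl(Path(name)))
# With both files present, len(entries) should be 1011.
```

The same list can also be loaded with `datasets.load_dataset("json", data_files=parts)` if the Hugging Face `datasets` library is available.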
## Themes Included
- **Exploring Christ-Consciousness**: Dialogues on understanding and realizing Christ-Consciousness in everyday life.
- **Living in Universal Love**: Reflections on how universal love is indicative of Christ-Consciousness.
- **The Path of Selfless Service**: Insights on how selfless service is a path toward Christ-Consciousness.
- **Unity with the Divine**: Practices and perspectives for fostering unity with the Divine.
- **Transformation through Forgiveness**: The transformative power of forgiveness in the journey towards Christ-Consciousness.
## Usage
These datasets are particularly suitable for researchers, developers, and spiritual enthusiasts looking to fine-tune conversational AI models for spiritual counseling, education, and exploration. They offer a rich foundation for developing AI systems capable of engaging with users on topics related to consciousness and spirituality.
When to Use Each Format:
- **Direct Q&A with Labels (Format 1)**: use when training models that require a clear distinction between prompts and responses, such as customer support chatbots, educational tools, or any application where direct answers to specific questions are paramount.
- **Integrated Conversational Flow (Format 2)**: better suited to narrative generation, therapeutic bots, coaching tools, or any application where the conversation's natural flow and the ability to engage in a more human-like manner are critical.
## Note
Please use these datasets responsibly, ensuring their application aligns with ethical guidelines and promotes positive, insightful discourse.
## Additional Resources
For more explorations on consciousness and spirituality, visit @InnerIGPT. |
Minata/method2test_10k_tokonized | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: test
num_bytes: 502280468
num_examples: 75335
- name: train
num_bytes: 66680000
num_examples: 10000
download_size: 34924994
dataset_size: 568960468
---
# Dataset Card for "method2test_10k_tokonized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MicPie/unpredictable_cluster17 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-cluster17
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-cluster17" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://ethanperez.net/unpredictable
- **Repository:** https://github.com/JunShern/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
- **Point of Contact:** junshern@nyu.edu, perez@nyu.edu
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/MicPie/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on a manual human quality rating (please see our publication for details of the ratings):
* [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low)
* [UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium)
* [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high)
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-baseball-fantasysports-yahoo-com](https://huggingface.co/datasets/MicPie/unpredictable_baseball-fantasysports-yahoo-com)
* [UnpredicTable-bulbapedia-bulbagarden-net](https://huggingface.co/datasets/MicPie/unpredictable_bulbapedia-bulbagarden-net)
* [UnpredicTable-cappex-com](https://huggingface.co/datasets/MicPie/unpredictable_cappex-com)
* [UnpredicTable-cram-com](https://huggingface.co/datasets/MicPie/unpredictable_cram-com)
* [UnpredicTable-dividend-com](https://huggingface.co/datasets/MicPie/unpredictable_dividend-com)
* [UnpredicTable-dummies-com](https://huggingface.co/datasets/MicPie/unpredictable_dummies-com)
* [UnpredicTable-en-wikipedia-org](https://huggingface.co/datasets/MicPie/unpredictable_en-wikipedia-org)
* [UnpredicTable-ensembl-org](https://huggingface.co/datasets/MicPie/unpredictable_ensembl-org)
* [UnpredicTable-gamefaqs-com](https://huggingface.co/datasets/MicPie/unpredictable_gamefaqs-com)
* [UnpredicTable-mgoblog-com](https://huggingface.co/datasets/MicPie/unpredictable_mgoblog-com)
* [UnpredicTable-mmo-champion-com](https://huggingface.co/datasets/MicPie/unpredictable_mmo-champion-com)
* [UnpredicTable-msdn-microsoft-com](https://huggingface.co/datasets/MicPie/unpredictable_msdn-microsoft-com)
* [UnpredicTable-phonearena-com](https://huggingface.co/datasets/MicPie/unpredictable_phonearena-com)
* [UnpredicTable-sittercity-com](https://huggingface.co/datasets/MicPie/unpredictable_sittercity-com)
* [UnpredicTable-sporcle-com](https://huggingface.co/datasets/MicPie/unpredictable_sporcle-com)
* [UnpredicTable-studystack-com](https://huggingface.co/datasets/MicPie/unpredictable_studystack-com)
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/MicPie/unpredictable_support-google-com)
* [UnpredicTable-w3-org](https://huggingface.co/datasets/MicPie/unpredictable_w3-org)
* [UnpredicTable-wiki-openmoko-org](https://huggingface.co/datasets/MicPie/unpredictable_wiki-openmoko-org)
* [UnpredicTable-wkdu-org](https://huggingface.co/datasets/MicPie/unpredictable_wkdu-org)
* UnpredicTable data subsets based on clustering (for the clustering details please see our publication):
* [UnpredicTable-cluster00](https://huggingface.co/datasets/MicPie/unpredictable_cluster00)
* [UnpredicTable-cluster01](https://huggingface.co/datasets/MicPie/unpredictable_cluster01)
* [UnpredicTable-cluster02](https://huggingface.co/datasets/MicPie/unpredictable_cluster02)
* [UnpredicTable-cluster03](https://huggingface.co/datasets/MicPie/unpredictable_cluster03)
* [UnpredicTable-cluster04](https://huggingface.co/datasets/MicPie/unpredictable_cluster04)
* [UnpredicTable-cluster05](https://huggingface.co/datasets/MicPie/unpredictable_cluster05)
* [UnpredicTable-cluster06](https://huggingface.co/datasets/MicPie/unpredictable_cluster06)
* [UnpredicTable-cluster07](https://huggingface.co/datasets/MicPie/unpredictable_cluster07)
* [UnpredicTable-cluster08](https://huggingface.co/datasets/MicPie/unpredictable_cluster08)
* [UnpredicTable-cluster09](https://huggingface.co/datasets/MicPie/unpredictable_cluster09)
* [UnpredicTable-cluster10](https://huggingface.co/datasets/MicPie/unpredictable_cluster10)
* [UnpredicTable-cluster11](https://huggingface.co/datasets/MicPie/unpredictable_cluster11)
* [UnpredicTable-cluster12](https://huggingface.co/datasets/MicPie/unpredictable_cluster12)
* [UnpredicTable-cluster13](https://huggingface.co/datasets/MicPie/unpredictable_cluster13)
* [UnpredicTable-cluster14](https://huggingface.co/datasets/MicPie/unpredictable_cluster14)
* [UnpredicTable-cluster15](https://huggingface.co/datasets/MicPie/unpredictable_cluster15)
* [UnpredicTable-cluster16](https://huggingface.co/datasets/MicPie/unpredictable_cluster16)
* [UnpredicTable-cluster17](https://huggingface.co/datasets/MicPie/unpredictable_cluster17)
* [UnpredicTable-cluster18](https://huggingface.co/datasets/MicPie/unpredictable_cluster18)
* [UnpredicTable-cluster19](https://huggingface.co/datasets/MicPie/unpredictable_cluster19)
* [UnpredicTable-cluster20](https://huggingface.co/datasets/MicPie/unpredictable_cluster20)
* [UnpredicTable-cluster21](https://huggingface.co/datasets/MicPie/unpredictable_cluster21)
* [UnpredicTable-cluster22](https://huggingface.co/datasets/MicPie/unpredictable_cluster22)
* [UnpredicTable-cluster23](https://huggingface.co/datasets/MicPie/unpredictable_cluster23)
* [UnpredicTable-cluster24](https://huggingface.co/datasets/MicPie/unpredictable_cluster24)
* [UnpredicTable-cluster25](https://huggingface.co/datasets/MicPie/unpredictable_cluster25)
* [UnpredicTable-cluster26](https://huggingface.co/datasets/MicPie/unpredictable_cluster26)
* [UnpredicTable-cluster27](https://huggingface.co/datasets/MicPie/unpredictable_cluster27)
* [UnpredicTable-cluster28](https://huggingface.co/datasets/MicPie/unpredictable_cluster28)
* [UnpredicTable-cluster29](https://huggingface.co/datasets/MicPie/unpredictable_cluster29)
* [UnpredicTable-cluster-noise](https://huggingface.co/datasets/MicPie/unpredictable_cluster-noise)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. The shape of our dataset is very wide, i.e., we have thousands of tasks, while each task has only a few examples, in contrast to most current NLP datasets, which are very deep, i.e., tens of tasks with many examples each. This implies that our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning/pre-training on our dataset.
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a jsonline file and consists of several few-shot examples. Each example is a dictionary containing a field 'task', which identifies the task, followed by an 'input', 'options', and 'output' field. The 'input' field contains several column elements of the same row in the table, while the 'output' field is a target which represents an individual column of the same row. Each task contains several such examples which can be concatenated as a few-shot task. In the case of multiple choice classification, the 'options' field contains the possible classes that a model needs to choose from.
There are also additional meta-data fields such as 'pageTitle', 'title', 'outputColName', 'url', 'wdcFile'.
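Concretely, the examples of one task can be concatenated into a few-shot prompt along these lines (the prompt template and the `build_few_shot_prompt` helper name are our own; the field names follow the schema above):

```python
def build_few_shot_prompt(examples: list[dict]) -> str:
    """Concatenate a task's examples into a single few-shot prompt.

    Each example contributes its 'input', optional 'options', and
    'output' fields; examples are separated by blank lines.
    """
    blocks = []
    for ex in examples:
        lines = [f"Input: {ex['input']}"]
        if ex.get("options"):
            lines.append("Options: " + ", ".join(ex["options"]))
        lines.append(f"Output: {ex['output']}")
        blocks.append("\n".join(lines))
    return "\n\n".join(blocks)

# A made-up two-example task for illustration.
task = [
    {"input": "Capital of France", "options": ["Paris", "Rome"], "output": "Paris"},
    {"input": "Capital of Italy", "options": ["Paris", "Rome"], "output": "Rome"},
]
print(build_few_shot_prompt(task))
```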
### Data Fields
- `task`: task identifier
- `input`: column elements of a specific row in the table
- `options`: for multiple-choice classification, the options to choose from
- `output`: target column element of the same row as the input
- `pageTitle`: the title of the page containing the table
- `outputColName`: output column name
- `url`: URL of the website containing the table
- `wdcFile`: WDC Web Table Corpus file
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
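The core tables-to-tasks idea can be sketched as follows. This is only an illustrative reconstruction: the real pipeline, with its filtering and option-construction steps, is described in the publication, and the `table_to_task` helper is our own.

```python
def table_to_task(header: list[str],
                  rows: list[list[str]],
                  output_col: str) -> list[dict]:
    """Turn one table into a few-shot task: the chosen column becomes
    the target, and the remaining cells of each row become the input."""
    out_idx = header.index(output_col)
    examples = []
    for row in rows:
        inputs = [f"{col}: {val}"
                  for col, val in zip(header, row) if col != output_col]
        examples.append({"input": " | ".join(inputs),
                         "output": row[out_idx]})
    return examples

# A made-up table for illustration.
header = ["Player", "Team", "Position"]
rows = [["Alice", "Reds", "Goalkeeper"], ["Bob", "Blues", "Striker"]]
print(table_to_task(header, rows, "Position"))
```

Each table thus yields one task whose examples share a common structure, which is what makes the rows usable as few-shot demonstrations.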
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Annotations
#### Annotation process
Manual annotation was only carried out for the [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low),
[UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium), and [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high) data subsets to rate task quality. Detailed annotation instructions can be found in our publication.
#### Who are the annotators?
Annotations were carried out by a lab assistant.
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables that are scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis on the biases prevalent in our datasets. Neither have we explicitly filtered the content. This implies that a model trained on our dataset may potentially reflect harmful biases and toxic text that exist in our dataset.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Dataset Curators
Jun Shern Chan, Michael Pieler, Jonathan Jao, Jérémy Scheurer, Ethan Perez
### Licensing Information
Apache 2.0
### Citation Information
```
@misc{chan2022few,
author = {Chan, Jun Shern and Pieler, Michael and Jao, Jonathan and Scheurer, Jérémy and Perez, Ethan},
title = {Few-shot Adaptation Works with UnpredicTable Data},
publisher={arXiv},
year = {2022},
url = {https://arxiv.org/abs/2208.01009}
}
```
|