| datasetId | card |
|---|---|
jvadlamudi2/TripAdvisor | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 8128297.568
num_examples: 1114
download_size: 8092406
dataset_size: 8128297.568
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "TripAdvisor"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LNTANOooo/alpaca-gpt4-chinese_v3 | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 33233480
num_examples: 49643
download_size: 20897060
dataset_size: 33233480
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
WeixiangYan/CodeTransOcean | ---
license: apache-2.0
---
|
vinicm/modelojoma | ---
license: openrail
---
|
trymtv/norwegian-parliament-speeches | ---
license: cc0-1.0
task_categories:
- text-classification
language:
- 'no'
pretty_name: Norwegian parliament speeches
size_categories:
- 100K<n<1M
---
# Dataset Card for Dataset Name
## Dataset Details
### Dataset Description
Speeches from the Norwegian parliament between 1998 and 2022, parsed from the Norwegian part of the EU ParlaMint corpus (ParlaMint-NO).
### Dataset Sources
Source: https://www.nb.no/sprakbanken/en/resource-catalogue/oai-nb-no-sbr-77/ |
jingwora/amz-review-mask-ja | ---
dataset_info:
features:
- name: product_name
dtype: string
- name: review_headline
dtype: string
- name: review_detail
dtype: string
- name: stars
dtype: int64
splits:
- name: train
num_bytes: 53545
num_examples: 132
download_size: 27711
dataset_size: 53545
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "amz-review-mask-ja"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kami786/kami | ---
license: other
---
|
Atipico1/nq-test-valid_adv_passage | ---
dataset_info:
features:
- name: question
dtype: string
- name: entity
dtype: string
- name: similar_entity
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: masked_query
dtype: string
- name: original_case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: unans_case
list:
- name: answer
dtype: string
- name: answers
sequence: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: conflict_case
list:
- name: answer
dtype: string
- name: conflict_context
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: context
dtype: string
- name: context_vague
dtype: string
- name: entities
dtype: string
- name: entities_count
dtype: int64
- name: adv_sent
dtype: string
- name: adv_passage
dtype: string
- name: cos_sim
dtype: float64
- name: answer_match
dtype: bool
- name: is_valid_adversary
dtype: bool
splits:
- name: train
num_bytes: 58428413
num_examples: 3610
download_size: 33883766
dataset_size: 58428413
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
enoahjr/twitter_dataset_1713228247 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 242439
num_examples: 688
download_size: 121496
dataset_size: 242439
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/squad_qa_context_v5_full_recite_full_passage_last_permute_rerun | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 6222942.0
num_examples: 2385
- name: validation
num_bytes: 808532
num_examples: 300
download_size: 1374285
dataset_size: 7031474.0
---
# Dataset Card for "squad_qa_context_v5_full_recite_full_passage_last_permute_rerun"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ggomma/aika-images | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 122843163.0
num_examples: 321
download_size: 122848518
dataset_size: 122843163.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hantech/correct_dataset | ---
dataset_info:
features:
- name: source_text
dtype: string
- name: target_text
dtype: string
splits:
- name: train
num_bytes: 80541676
num_examples: 626100
download_size: 11445024
dataset_size: 80541676
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "correct_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/UAE_Arabic_Spontaneous_Speech_Data | ---
task_categories:
- automatic-speech-recognition
language:
- ar
---
# Dataset Card for Nexdata/UAE_Arabic_Spontaneous_Speech_Data
## Description
This dataset contains 749 hours of UAE Arabic spontaneous speech covering multiple topics. All audio was manually transcribed into text; speaker identity, gender, and other attributes are also annotated. The dataset can be used for voiceprint recognition model training, corpus construction for machine translation, and algorithm research.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1180?source=Huggingface
# Specifications
## Format
16kHz, 16bit, mono channel;
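As a quick sanity check of this format, the snippet below writes a one-second silent clip matching the spec (16 kHz, 16-bit, mono) with Python's standard `wave` module and reads the parameters back; `sample.wav` is a hypothetical local filename, not a file from this dataset.

```python
import wave

# Write a 1-second silent clip at the card's spec (16 kHz, 16-bit, mono).
# "sample.wav" is a hypothetical filename used only for this sketch.
with wave.open("sample.wav", "wb") as wav_out:
    wav_out.setnchannels(1)       # mono
    wav_out.setsampwidth(2)       # 16-bit samples = 2 bytes
    wav_out.setframerate(16000)   # 16 kHz
    wav_out.writeframes(b"\x00\x00" * 16000)  # one second of silence

# Read the parameters back to confirm the format.
with wave.open("sample.wav", "rb") as wav_in:
    params = (wav_in.getnchannels(), wav_in.getsampwidth(), wav_in.getframerate())

print(params)  # (1, 2, 16000)
```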
## Content category
Interview; Speech; Variety, etc.
## Language
UAE Arabic;
## Annotation
Transcription text, speaker identification, and gender;
## Application scenarios
Speech recognition, video caption generation, and video content review;
## Accuracy
Sentence Accuracy Rate (SAR) of no less than 95%.
# Licensing Information
Commercial License |
kristmh/mypy_testset_with_med_low | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: text_clean
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 718380
num_examples: 734
download_size: 314925
dataset_size: 718380
---
# Dataset Card for "mypy_testset_with_med_low"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/cure_sky_hirogaruskyprecure | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Cure Sky
This is the dataset of Cure Sky, containing 200 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 449 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 449 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 449 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 449 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Vinnyyw/Dulcevocais | ---
license: openrail
---
|
Baidicoot/alpaca-deployment-0.3-0.05 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 10635052.0
num_examples: 31323
download_size: 5513753
dataset_size: 10635052.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_rte_my_i | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 14091
num_examples: 22
- name: train
num_bytes: 7834
num_examples: 12
download_size: 27640
dataset_size: 21925
---
# Dataset Card for "MULTI_VALUE_rte_my_i"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PaulineSanchez/TradXX | ---
dataset_info:
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- fr
splits:
- name: train
num_bytes: 262823
num_examples: 3153
download_size: 131381
dataset_size: 262823
---
# Dataset Card for "TradXX"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rescer/twitter_dataset_1713228638 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 1372792
num_examples: 4302
download_size: 777349
dataset_size: 1372792
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
iara-project/test_split_with_embeddings_bert_base_portuguese | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: news_id
dtype: int64
- name: embeddings
dtype: int64
- name: sentence
dtype: string
- name: category
dtype: string
splits:
- name: test
num_bytes: 588008891
num_examples: 176114
download_size: 365796407
dataset_size: 588008891
---
# Dataset Card for "test_split_with_embeddings_bert_base_portuguese"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MicPie/unpredictable_cluster07 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-cluster07
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-cluster07" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://ethanperez.net/unpredictable
- **Repository:** https://github.com/JunShern/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
- **Point of Contact:** junshern@nyu.edu, perez@nyu.edu
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/MicPie/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on a manual human quality rating (please see our publication for details of the ratings):
* [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low)
* [UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium)
* [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high)
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-baseball-fantasysports-yahoo-com](https://huggingface.co/datasets/MicPie/unpredictable_baseball-fantasysports-yahoo-com)
* [UnpredicTable-bulbapedia-bulbagarden-net](https://huggingface.co/datasets/MicPie/unpredictable_bulbapedia-bulbagarden-net)
* [UnpredicTable-cappex-com](https://huggingface.co/datasets/MicPie/unpredictable_cappex-com)
* [UnpredicTable-cram-com](https://huggingface.co/datasets/MicPie/unpredictable_cram-com)
* [UnpredicTable-dividend-com](https://huggingface.co/datasets/MicPie/unpredictable_dividend-com)
* [UnpredicTable-dummies-com](https://huggingface.co/datasets/MicPie/unpredictable_dummies-com)
* [UnpredicTable-en-wikipedia-org](https://huggingface.co/datasets/MicPie/unpredictable_en-wikipedia-org)
* [UnpredicTable-ensembl-org](https://huggingface.co/datasets/MicPie/unpredictable_ensembl-org)
* [UnpredicTable-gamefaqs-com](https://huggingface.co/datasets/MicPie/unpredictable_gamefaqs-com)
* [UnpredicTable-mgoblog-com](https://huggingface.co/datasets/MicPie/unpredictable_mgoblog-com)
* [UnpredicTable-mmo-champion-com](https://huggingface.co/datasets/MicPie/unpredictable_mmo-champion-com)
* [UnpredicTable-msdn-microsoft-com](https://huggingface.co/datasets/MicPie/unpredictable_msdn-microsoft-com)
* [UnpredicTable-phonearena-com](https://huggingface.co/datasets/MicPie/unpredictable_phonearena-com)
* [UnpredicTable-sittercity-com](https://huggingface.co/datasets/MicPie/unpredictable_sittercity-com)
* [UnpredicTable-sporcle-com](https://huggingface.co/datasets/MicPie/unpredictable_sporcle-com)
* [UnpredicTable-studystack-com](https://huggingface.co/datasets/MicPie/unpredictable_studystack-com)
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/MicPie/unpredictable_support-google-com)
* [UnpredicTable-w3-org](https://huggingface.co/datasets/MicPie/unpredictable_w3-org)
* [UnpredicTable-wiki-openmoko-org](https://huggingface.co/datasets/MicPie/unpredictable_wiki-openmoko-org)
* [UnpredicTable-wkdu-org](https://huggingface.co/datasets/MicPie/unpredictable_wkdu-org)
* UnpredicTable data subsets based on clustering (for the clustering details please see our publication):
* [UnpredicTable-cluster00](https://huggingface.co/datasets/MicPie/unpredictable_cluster00)
* [UnpredicTable-cluster01](https://huggingface.co/datasets/MicPie/unpredictable_cluster01)
* [UnpredicTable-cluster02](https://huggingface.co/datasets/MicPie/unpredictable_cluster02)
* [UnpredicTable-cluster03](https://huggingface.co/datasets/MicPie/unpredictable_cluster03)
* [UnpredicTable-cluster04](https://huggingface.co/datasets/MicPie/unpredictable_cluster04)
* [UnpredicTable-cluster05](https://huggingface.co/datasets/MicPie/unpredictable_cluster05)
* [UnpredicTable-cluster06](https://huggingface.co/datasets/MicPie/unpredictable_cluster06)
* [UnpredicTable-cluster07](https://huggingface.co/datasets/MicPie/unpredictable_cluster07)
* [UnpredicTable-cluster08](https://huggingface.co/datasets/MicPie/unpredictable_cluster08)
* [UnpredicTable-cluster09](https://huggingface.co/datasets/MicPie/unpredictable_cluster09)
* [UnpredicTable-cluster10](https://huggingface.co/datasets/MicPie/unpredictable_cluster10)
* [UnpredicTable-cluster11](https://huggingface.co/datasets/MicPie/unpredictable_cluster11)
* [UnpredicTable-cluster12](https://huggingface.co/datasets/MicPie/unpredictable_cluster12)
* [UnpredicTable-cluster13](https://huggingface.co/datasets/MicPie/unpredictable_cluster13)
* [UnpredicTable-cluster14](https://huggingface.co/datasets/MicPie/unpredictable_cluster14)
* [UnpredicTable-cluster15](https://huggingface.co/datasets/MicPie/unpredictable_cluster15)
* [UnpredicTable-cluster16](https://huggingface.co/datasets/MicPie/unpredictable_cluster16)
* [UnpredicTable-cluster17](https://huggingface.co/datasets/MicPie/unpredictable_cluster17)
* [UnpredicTable-cluster18](https://huggingface.co/datasets/MicPie/unpredictable_cluster18)
* [UnpredicTable-cluster19](https://huggingface.co/datasets/MicPie/unpredictable_cluster19)
* [UnpredicTable-cluster20](https://huggingface.co/datasets/MicPie/unpredictable_cluster20)
* [UnpredicTable-cluster21](https://huggingface.co/datasets/MicPie/unpredictable_cluster21)
* [UnpredicTable-cluster22](https://huggingface.co/datasets/MicPie/unpredictable_cluster22)
* [UnpredicTable-cluster23](https://huggingface.co/datasets/MicPie/unpredictable_cluster23)
* [UnpredicTable-cluster24](https://huggingface.co/datasets/MicPie/unpredictable_cluster24)
* [UnpredicTable-cluster25](https://huggingface.co/datasets/MicPie/unpredictable_cluster25)
* [UnpredicTable-cluster26](https://huggingface.co/datasets/MicPie/unpredictable_cluster26)
* [UnpredicTable-cluster27](https://huggingface.co/datasets/MicPie/unpredictable_cluster27)
* [UnpredicTable-cluster28](https://huggingface.co/datasets/MicPie/unpredictable_cluster28)
* [UnpredicTable-cluster29](https://huggingface.co/datasets/MicPie/unpredictable_cluster29)
* [UnpredicTable-cluster-noise](https://huggingface.co/datasets/MicPie/unpredictable_cluster-noise)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. Our dataset is very wide in shape, i.e., it has thousands of tasks, each with only a few examples, whereas most current NLP datasets are very deep, i.e., tens of tasks with many examples each. This implies that our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning/pre-training on our dataset.
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a JSON Lines file and consists of several few-shot examples. Each example is a dictionary with a 'task' field identifying the task, followed by 'input', 'options', and 'output' fields. The 'input' field contains several column elements of the same row in the table, while the 'output' field is a target representing an individual column of that row. Each task contains several such examples, which can be concatenated into a few-shot task. For multiple-choice classification, the 'options' field contains the possible classes a model must choose from.
There are also additional meta-data fields such as 'pageTitle', 'title', 'outputColName', 'url', 'wdcFile'.
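The format described above can be sketched as a single JSON Lines record; the field values below are illustrative, not drawn from the actual corpus, and only the field names follow the card.

```python
import json

# One illustrative task record in the JSON Lines format described above.
line = json.dumps({
    "task": "product_price_classification",
    "input": "Name: Widget | Material: Steel",
    "options": ["cheap", "expensive"],
    "output": "cheap",
    "pageTitle": "Product catalogue",
    "outputColName": "price_class",
    "url": "https://example.com/products",
    "wdcFile": "example.json.gz",
})

example = json.loads(line)
# Few-shot prompts are built by concatenating such input/output pairs.
shot = f"{example['input']}\nAnswer: {example['output']}"
print(shot)
```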
### Data Fields
'task': the task identifier.
'input': column elements of a specific row in the table.
'options': for multiple-choice classification, the options to choose from.
'output': the target column element of the same row as the input.
'pageTitle': the title of the page containing the table.
'outputColName': the name of the output column.
'url': the URL of the website containing the table.
'wdcFile': the WDC Web Table Corpus file.
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Annotations
#### Annotation process
Manual annotation was only carried out for the [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low),
[UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium), and [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high) data subsets to rate task quality. Detailed annotation instructions can be found in our publication.
#### Who are the annotators?
Annotations were carried out by a lab assistant.
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables that are scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis on the biases prevalent in our datasets. Neither have we explicitly filtered the content. This implies that a model trained on our dataset may potentially reflect harmful biases and toxic text that exist in our dataset.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Dataset Curators
Jun Shern Chan, Michael Pieler, Jonathan Jao, Jérémy Scheurer, Ethan Perez
### Licensing Information
Apache 2.0
### Citation Information
```
@misc{chan2022few,
author = {Chan, Jun Shern and Pieler, Michael and Jao, Jonathan and Scheurer, Jérémy and Perez, Ethan},
title = {Few-shot Adaptation Works with UnpredicTable Data},
publisher={arXiv},
year = {2022},
url = {https://arxiv.org/abs/2208.01009}
}
```
|
cancl/reversal_curse_test | ---
license: llama2
---
|
alkzar90/rock-glacier-dataset | ---
annotations_creators:
- human-curator
language:
- en
license:
- mit
pretty_name: RockGlacier
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- image-classification
task_ids:
- multi-class-image-classification
---
# Dataset Card for Rock Glacier Detection
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [RockGlacier Homepage](https://github.com/alcazar90/rock-glacier-detection)
- **Repository:** [alcazar90/rock-glacier-detection](https://github.com/alcazar90/rock-glacier-detection)
- **Paper:** N/A
- **Leaderboard:** N/A
- **Point of Contact:** N/A
### Dataset Summary

Rock Glacier Detection dataset with satellite images of rock glaciers in the Chilean Andes.
### Supported Tasks and Leaderboards
- `image-classification`: Based on satellite images (from Sentinel-2), the goal of this task is to predict whether there is a rock glacier in the geographic area.
- `image-segmentation`: ...
### Languages
Spanish
## Dataset Structure
### Data Instances
A sample from the image-classification training set is provided below:
```
from datasets import load_dataset

df = load_dataset("alkzar90/rock-glacier-dataset", name="image-classification")
df["train"][666]
> {'image': <PIL.PngImagePlugin.PngImageFile image mode=RGBA size=128x128 at 0x7FB2EC58C6D0>,
'labels': 0,
'path': 'train/cordillera/1512.png'
}
```
A sample from the image-segmentation training set is provided below:
```
from datasets import load_dataset

df = load_dataset("alkzar90/rock-glacier-dataset", name="image-segmentation")
df["train"][666]
> {'image': <PIL.PngImagePlugin.PngImageFile image mode=RGBA size=128x128 at 0x7FB2EB7C1160>,
'masks': <PIL.PngImagePlugin.PngImageFile image mode=RGBA size=128x128 at 0x7FB2EC5A08E0>,
'path': 'train/cordillera/1512.png'}
```
### Data Fields
The data instances have the following fields:
- `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`.
- `labels`: an `int` classification label.
Class Label Mappings:
```json
{
  "cordillera": 0,
  "glaciar": 1
}
```
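A minimal sketch of using this mapping to turn predicted label ids back into class names; the prediction list is made up for illustration.

```python
# Invert the class-label mapping shown above; predictions are illustrative.
id2label = {0: "cordillera", 1: "glaciar"}
preds = [0, 1, 1, 0]
names = [id2label[i] for i in preds]
print(names)  # ['cordillera', 'glaciar', 'glaciar', 'cordillera']
```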
### Data Splits
| |train|validation| test|
|-------------|----:|---------:|-----:|
|# of examples|7875 |1125 |2700 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@ONLINE {rock-glacier-dataset,
author="CMM - Glaciares (UChile)",
title="Rock Glacier Dataset",
month="October",
year="2022",
url="https://github.com/alcazar90/rock-glacier-detection"
}
```
### Contributions
Thanks to...
|
louisbrulenaudet/code-impots-annexe-ii | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code général des impôts, annexe II
source_datasets:
- original
pretty_name: Code général des impôts, annexe II
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code général des impôts, annexe II, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging

import datasets
from tqdm.notebook import tqdm


def dataset_loader(
    name: str,
    streaming: bool = True
) -> datasets.Dataset:
    """
    Helper function to load a single dataset in parallel.

    Parameters
    ----------
    name : str
        Name of the dataset to be loaded.

    streaming : bool, optional
        Determines if datasets are streamed. Default is True.

    Returns
    -------
    dataset : datasets.Dataset
        Loaded dataset object.

    Raises
    ------
    Exception
        If an error occurs during dataset loading.
    """
    try:
        return datasets.load_dataset(
            name,
            split="train",
            streaming=streaming
        )
    except Exception as exc:
        logging.error(f"Error loading dataset {name}: {exc}")
        return None


def load_datasets(
    req: list,
    streaming: bool = True
) -> list:
    """
    Downloads datasets specified in a list and creates a list of loaded datasets.

    Parameters
    ----------
    req : list
        A list containing the names of datasets to be downloaded.

    streaming : bool, optional
        Determines if datasets are streamed. Default is True.

    Returns
    -------
    datasets_list : list
        A list containing loaded datasets as per the requested names provided in 'req'.

    Raises
    ------
    Exception
        If an error occurs during dataset loading or processing.

    Examples
    --------
    >>> datasets_list = load_datasets(["dataset1", "dataset2"], streaming=False)
    """
    datasets_list = []

    with concurrent.futures.ThreadPoolExecutor() as executor:
        future_to_dataset = {
            executor.submit(dataset_loader, name, streaming): name
            for name in req
        }

        for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
            name = future_to_dataset[future]

            try:
                dataset = future.result()

                if dataset:
                    datasets_list.append(dataset)

            except Exception as exc:
                logging.error(f"Error processing dataset {name}: {exc}")

    return datasets_list


req = [
    "louisbrulenaudet/code-artisanat",
    "louisbrulenaudet/code-action-sociale-familles",
    # ...
]

datasets_list = load_datasets(
    req=req,
    streaming=True
)

dataset = datasets.concatenate_datasets(
    datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries, each of which contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
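Since each article carries its entry-into-force and expiration dates, it is straightforward to keep only the articles in force at a given date. The helper below is a minimal sketch, assuming the `start` and `expiration` fields hold ISO `YYYY-MM-DD` strings:

```python
from datetime import date


def in_force(record: dict, on: date) -> bool:
    """Return True if an article is in force on the given date.

    Assumes `start` and `expiration` are ISO 'YYYY-MM-DD' strings.
    """
    start = date.fromisoformat(record["start"])
    expiration = date.fromisoformat(record["expiration"])
    return start <= on < expiration


# Hypothetical record shaped like the fields listed above
record = {"start": "2024-04-15", "expiration": "2999-01-01", "num": "1"}
print(in_force(record, date(2024, 6, 1)))
```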
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
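To illustrate how such instructions can be paired with article text to produce records in the schema described above, here is a hypothetical helper (not the actual generation script; the field contents shown are placeholders):

```python
import random

# A small subset of the instruction list above
instructions = [
    "Compose l'intégralité de l'article sous forme écrite.",
    "Quel est le texte intégral de l'article ?",
]


def make_record(num: str, text: str, start: str, expiration: str) -> dict:
    """Build one instruction-tuning record in the dataset's schema."""
    return {
        "instruction": random.choice(instructions),
        "input": f"Code général des impôts, annexe II, art. {num}",
        "output": text,
        "start": start,
        "expiration": expiration,
        "num": num,
    }


record = make_record("1", "Texte de l'article...", "2024-04-15", "2999-01-01")
print(sorted(record.keys()))
```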
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
evilback/sample_data | ---
dataset_info:
features:
- name: 'Questions '
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 37322.29268292683
num_examples: 203
- name: test
num_bytes: 367.7073170731707
num_examples: 2
download_size: 16167
dataset_size: 37690.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
eengel7/sentiment_analysis_batch_predictions | ---
license: apache-2.0
---
|
CShorten/CORD19-Chunk-1 | ---
license: afl-3.0
---
|
OpenGVLab/LORIS | ---
license: cc-by-nc-sa-4.0
tags:
- music
- AIGC
- art
language:
- en
size_categories:
- 10K<n<100K
---
# Dataset Card for LORIS
## Dataset Description
- **Homepage:** [LORIS](https://justinyuu.github.io/LORIS)
- **Repository:** [OpenGVLab-LORIS](https://github.com/OpenGVLab/LORIS)
- **Paper:** [2305.01319](https://arxiv.org/pdf/2305.01319.pdf)
- **Point of Contact:** [Jiashuo Yu](mailto:yujiashuo@pjlab.org.cn)
### Dataset Summary
The LORIS dataset is a large-scale rhythmic video soundtrack dataset comprising 86.43 hours of long-term, high-quality raw videos with corresponding 2D poses, RGB features, and ameliorated audio waveforms. It was originally built for the video background music generation (video soundtrack) task.
### Get Started
```python
from datasets import load_dataset

dataset = load_dataset("OpenGVLab/LORIS")
```
### Citation Information
```bibtex
@inproceedings{Yu2023Long,
  title={Long-Term Rhythmic Video Soundtracker},
  author={Yu, Jiashuo and Wang, Yaohui and Chen, Xinyuan and Sun, Xiao and Qiao, Yu},
  booktitle={International Conference on Machine Learning (ICML)},
  year={2023}
}
```
|
ibranze/araproje_hellaswag_tr_conf_gpt2_bestscore_reversed | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 87090
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_conf_gpt2_bestscore_reversed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pritamdeka/dataset_dnrti_valid | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 474551
num_examples: 661
download_size: 142846
dataset_size: 474551
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yonischeyer/trainDataTempZero | ---
license: unknown
---
|
distilled-from-one-sec-cv12/chunk_11 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1011594180
num_examples: 197115
download_size: 1029925199
dataset_size: 1011594180
---
# Dataset Card for "chunk_11"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sonish/gdp-dummy | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4658738
num_examples: 361
download_size: 624376
dataset_size: 4658738
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-inverse-scaling__redefine-math-inverse-scaling__redefin-f7efd9-1695359601 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- inverse-scaling/redefine-math
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-2.7b_eval
metrics: []
dataset_name: inverse-scaling/redefine-math
dataset_config: inverse-scaling--redefine-math
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-2.7b_eval
* Dataset: inverse-scaling/redefine-math
* Config: inverse-scaling--redefine-math
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@MicPie](https://huggingface.co/MicPie) for evaluating this model. |
autoevaluate/autoeval-staging-eval-project-glue-fa8727be-13825907 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: natural_language_inference
model: autoevaluate/glue-mrpc
metrics: []
dataset_name: glue
dataset_config: mrpc
dataset_split: test
col_mapping:
text1: sentence1
text2: sentence2
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Natural Language Inference
* Model: autoevaluate/glue-mrpc
* Dataset: glue
* Config: mrpc
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
cleanrl/summarize_from_feedback_tldr_3_filtered_oai_preprocessing_pythia-160m_53 | ---
dataset_info:
features:
- name: id
dtype: string
- name: subreddit
dtype: string
- name: title
dtype: string
- name: post
dtype: string
- name: summary
dtype: string
- name: query_token
sequence: int64
- name: query
dtype: string
- name: reference_response
dtype: string
- name: reference_response_token
sequence: int64
- name: reference_response_token_len
dtype: int64
- name: query_reference_response
dtype: string
- name: query_reference_response_token
sequence: int64
- name: query_reference_response_token_len
dtype: int64
splits:
- name: train
num_bytes: 1794282399
num_examples: 116722
- name: validation
num_bytes: 99115351
num_examples: 6447
- name: test
num_bytes: 100764966
num_examples: 6553
download_size: 573863936
dataset_size: 1994162716
---
# TL;DR SFT Dataset for OpenAI's [Summarize from Feedback](https://openai.com/blog/summarization/) task
The dataset is directly taken from https://github.com/openai/summarize-from-feedback/tree/700967448d10004279f138666442bf1497d0e705#reddit-tldr-dataset
These columns are taken directly from the aforementioned dataset:
* **id**: unique identifier for the post
* **subreddit**: subreddit the post was taken from
* **title**: title of the post
* **post**: body of the post
* **summary**: summary of the post
* **reference_response**: reference response for the post
These columns are added by this preprocessing script:
* **query**: length-limited query for summarization: OAI pre-processes the main text (title + subreddit + post), ensuring it has only 512 tokens; if the main text is too long, it tries to truncate at the last `\n`. If it's too short, it pads the main text ([summarize_from_feedback/tasks.py#L98-L165](https://github.com/openai/summarize-from-feedback/blob/700967448d10004279f138666442bf1497d0e705/summarize_from_feedback/tasks.py#L98-L165)). Padding is either a space or the `[PAD]` token (see Args below).
* **query_token**: tokenized version of `query`
* **reference_response_token**: tokenized version of `reference_response`
* **reference_response_token_len**: length of `reference_response_token`
* **query_reference_response**: concatenation of `query.strip()` and `reference_response`
* **query_reference_response_token**: tokenized version of `query_reference_response`, up to `max_sft_query_response_length` tokens
* **query_reference_response_token_len**: length of `query_reference_response_token`
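The padding behaviour described above can be sketched as follows. This is a simplified illustration of left-padding with the `[PAD]` token id 50277 from the Args below, not the exact OAI preprocessing code (which also truncates the `post` field at its last newline rather than cutting tokens from the front):

```python
def pad_query(tokens: list[int], length: int = 512, pad_id: int = 50277) -> list[int]:
    """Left-pad a token sequence to a fixed length.

    Sequences that are too long are truncated from the front here for
    simplicity; the real preprocessing truncates the post text at `\n`.
    """
    if len(tokens) >= length:
        return tokens[-length:]
    return [pad_id] * (length - len(tokens)) + tokens


padded = pad_query([101, 102, 103])
print(len(padded))
```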
# Args
```python
{'base_model': 'EleutherAI/pythia-160m',
'hf_entity': 'cleanrl',
'max_rm_query_response_length': 638,
'max_rm_response_length': 169,
'max_sft_query_response_length': 562,
'max_sft_response_length': 53,
'oai_params': TaskQueryHParams(length=512,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding=[50277],
pad_side='left'),
'push_to_hub': True}
{'format_str': 'SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
'length': 512,
'pad_side': 'left',
'padding': [50277],
'truncate_field': 'post',
'truncate_text': '\n'}
```
|
mhmtcrkglu/autotrain-data-testtranslation | ---
language:
- tr
- ar
task_categories:
- translation
---
# AutoTrain Dataset for project: testtranslation
## Dataset Description
This dataset has been automatically processed by AutoTrain for project testtranslation.
### Languages
The BCP-47 code for the dataset's language is tr2ar.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"source": "TrueMood",
"target": "\u062a\u0631\u0648\u0645\u0648\u062f"
},
{
"source": "cleanwax",
"target": "\u0643\u0644\u064a\u0646\u0648\u0627\u0643\u0633"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"source": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 24 |
| valid | 6 |
|
ridger/train_refineweb | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 91738479900
num_examples: 22375239
download_size: 13547146690
dataset_size: 91738479900
---
# Dataset Card for "train_refineweb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
maulinnasari/dataset_ext_80_mn | ---
dataset_info:
features:
- name: document
sequence: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 448199881
num_examples: 44972
- name: validation
num_bytes: 54777817
num_examples: 5622
- name: test
num_bytes: 55382864
num_examples: 5622
download_size: 326954148
dataset_size: 558360562
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
umarigan/turkish_wikipedia | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 1142404262
num_examples: 524601
download_size: 629924151
dataset_size: 1142404262
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- text-classification
- translation
- summarization
language:
- tr
size_categories:
- 100K<n<1M
---
# Dataset Card for "turkish_wikipedia"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Isaak-Carter__JOSIE_Beta-3-7B-slerp | ---
pretty_name: Evaluation run of Isaak-Carter/JOSIE_Beta-3-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Isaak-Carter/JOSIE_Beta-3-7B-slerp](https://huggingface.co/Isaak-Carter/JOSIE_Beta-3-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Isaak-Carter__JOSIE_Beta-3-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-15T21:22:25.477458](https://huggingface.co/datasets/open-llm-leaderboard/details_Isaak-Carter__JOSIE_Beta-3-7B-slerp/blob/main/results_2024-03-15T21-22-25.477458.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6432209013684985,\n\
\ \"acc_stderr\": 0.03221665824377992,\n \"acc_norm\": 0.6450099678239628,\n\
\ \"acc_norm_stderr\": 0.032867717920871294,\n \"mc1\": 0.3353733170134639,\n\
\ \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.48804542326643174,\n\
\ \"mc2_stderr\": 0.015087630632446147\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6083617747440273,\n \"acc_stderr\": 0.014264122124938217,\n\
\ \"acc_norm\": 0.6339590443686007,\n \"acc_norm_stderr\": 0.014077223108470139\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6618203545110536,\n\
\ \"acc_stderr\": 0.0047212316370927225,\n \"acc_norm\": 0.8456482772356104,\n\
\ \"acc_norm_stderr\": 0.0036054721167622867\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009245,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009245\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654366,\n\
\ \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654366\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909476,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909476\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n\
\ \"acc_stderr\": 0.04718471485219587,\n \"acc_norm\": 0.5535714285714286,\n\
\ \"acc_norm_stderr\": 0.04718471485219587\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.01377869377846408,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.01377869377846408\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.288268156424581,\n\
\ \"acc_stderr\": 0.015149132860209432,\n \"acc_norm\": 0.288268156424581,\n\
\ \"acc_norm_stderr\": 0.015149132860209432\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495158,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495158\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744546,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744546\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3353733170134639,\n\
\ \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.48804542326643174,\n\
\ \"mc2_stderr\": 0.015087630632446147\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8042620363062352,\n \"acc_stderr\": 0.011151145042218319\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5860500379075056,\n \
\ \"acc_stderr\": 0.013566991960151778\n }\n}\n```"
repo_url: https://huggingface.co/Isaak-Carter/JOSIE_Beta-3-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|arc:challenge|25_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|gsm8k|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hellaswag|10_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T21-22-25.477458.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T21-22-25.477458.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- '**/details_harness|winogrande|5_2024-03-15T21-22-25.477458.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-15T21-22-25.477458.parquet'
- config_name: results
data_files:
- split: 2024_03_15T21_22_25.477458
path:
- results_2024-03-15T21-22-25.477458.parquet
- split: latest
path:
- results_2024-03-15T21-22-25.477458.parquet
---
# Dataset Card for Evaluation run of Isaak-Carter/JOSIE_Beta-3-7B-slerp
Dataset automatically created during the evaluation run of model [Isaak-Carter/JOSIE_Beta-3-7B-slerp](https://huggingface.co/Isaak-Carter/JOSIE_Beta-3-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Isaak-Carter__JOSIE_Beta-3-7B-slerp",
"harness_winogrande_5",
	split="latest")
```
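Each reported mean comes with a standard error. As a minimal sketch (not part of the evaluation pipeline), the pair from the "all" block of the latest results below can be turned into an approximate 95% confidence interval using a normal approximation:

```python
# Sketch: approximate 95% confidence interval for the aggregate accuracy,
# using the mean and standard error reported in the "all" block below.
acc = 0.6432209013684985          # "acc" from the "all" block
acc_stderr = 0.03221665824377992  # "acc_stderr" from the "all" block

margin = 1.96 * acc_stderr        # normal-approximation 95% margin
low, high = acc - margin, acc + margin
print(f"acc = {acc:.4f}, 95% CI ~ [{low:.4f}, {high:.4f}]")
```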
## Latest results
These are the [latest results from run 2024-03-15T21:22:25.477458](https://huggingface.co/datasets/open-llm-leaderboard/details_Isaak-Carter__JOSIE_Beta-3-7B-slerp/blob/main/results_2024-03-15T21-22-25.477458.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6432209013684985,
"acc_stderr": 0.03221665824377992,
"acc_norm": 0.6450099678239628,
"acc_norm_stderr": 0.032867717920871294,
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.48804542326643174,
"mc2_stderr": 0.015087630632446147
},
"harness|arc:challenge|25": {
"acc": 0.6083617747440273,
"acc_stderr": 0.014264122124938217,
"acc_norm": 0.6339590443686007,
"acc_norm_stderr": 0.014077223108470139
},
"harness|hellaswag|10": {
"acc": 0.6618203545110536,
"acc_stderr": 0.0047212316370927225,
"acc_norm": 0.8456482772356104,
"acc_norm_stderr": 0.0036054721167622867
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009245,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009245
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654366,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654366
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909476,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909476
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.04718471485219587,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.04718471485219587
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.01377869377846408,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.01377869377846408
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.288268156424581,
"acc_stderr": 0.015149132860209432,
"acc_norm": 0.288268156424581,
"acc_norm_stderr": 0.015149132860209432
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495158,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495158
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.48804542326643174,
"mc2_stderr": 0.015087630632446147
},
"harness|winogrande|5": {
"acc": 0.8042620363062352,
"acc_stderr": 0.011151145042218319
},
"harness|gsm8k|5": {
"acc": 0.5860500379075056,
"acc_stderr": 0.013566991960151778
}
}
```
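The `"all"` block at the top of these results aggregates the per-task scores. As a rough approximation it behaves like an unweighted mean over the task entries; a minimal sketch (note this is an illustration only, and the leaderboard's exact aggregation, e.g. how the 57 MMLU subsets are grouped, may differ):

```python
def mean_metric(results: dict, metric: str = "acc") -> float:
    """Unweighted mean of `metric` over every task entry that reports it."""
    vals = [scores[metric] for task, scores in results.items()
            if task != "all" and metric in scores]
    return sum(vals) / len(vals)

# Illustrative subset of the results above
sample = {
    "all": {"acc": 0.597},
    "harness|arc:challenge|25": {"acc": 0.6083617747440273},
    "harness|winogrande|5": {"acc": 0.8042620363062352},
    "harness|gsm8k|5": {"acc": 0.5860500379075056},
}
print(round(mean_metric(sample), 4))  # 0.6662
```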
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ammarnasr/Python-Security-Code-Dataset | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: text
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphnanum_fraction
dtype: float64
splits:
- name: train
num_bytes: 5052530.634146341
num_examples: 1119
- name: test
num_bytes: 650191.6097560975
num_examples: 144
- name: valid
num_bytes: 1517113.756097561
num_examples: 336
download_size: 2652123
dataset_size: 7219835.999999999
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
AISHELL/HI-MIA | ---
license: apache-2.0
---
|
sankethgadadinni/alpaca-cleaned | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_amazingvince__where-llambo-7b | ---
pretty_name: Evaluation run of amazingvince/where-llambo-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [amazingvince/where-llambo-7b](https://huggingface.co/amazingvince/where-llambo-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_amazingvince__where-llambo-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T18:44:39.604520](https://huggingface.co/datasets/open-llm-leaderboard/details_amazingvince__where-llambo-7b/blob/main/results_2023-12-09T18-44-39.604520.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6276007814719067,\n\
\ \"acc_stderr\": 0.03245983620498288,\n \"acc_norm\": 0.6287066769044074,\n\
\ \"acc_norm_stderr\": 0.03312214889081226,\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.4961220088630948,\n\
\ \"mc2_stderr\": 0.014820546287012869\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5452218430034129,\n \"acc_stderr\": 0.014551507060836357,\n\
\ \"acc_norm\": 0.5844709897610921,\n \"acc_norm_stderr\": 0.014401366641216386\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.612427803226449,\n\
\ \"acc_stderr\": 0.004862003566798543,\n \"acc_norm\": 0.8205536745668194,\n\
\ \"acc_norm_stderr\": 0.00382941380511398\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229862,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229862\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.02460362692409742,\n \
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.02460362692409742\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n\
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n\
\ \"acc_stderr\": 0.01591955782997604,\n \"acc_norm\": 0.8348623853211009,\n\
\ \"acc_norm_stderr\": 0.01591955782997604\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n \
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.013890862162876173,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.013890862162876173\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500097,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500097\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27039106145251396,\n\
\ \"acc_stderr\": 0.014854993938010076,\n \"acc_norm\": 0.27039106145251396,\n\
\ \"acc_norm_stderr\": 0.014854993938010076\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236844,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236844\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\
\ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\
\ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.02957326913441112,\n\
\ \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.02957326913441112\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6486928104575164,\n \"acc_stderr\": 0.019312676065786565,\n \
\ \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.019312676065786565\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.4961220088630948,\n\
\ \"mc2_stderr\": 0.014820546287012869\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345402\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6520090978013646,\n \
\ \"acc_stderr\": 0.013120581030382134\n }\n}\n```"
repo_url: https://huggingface.co/amazingvince/where-llambo-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|arc:challenge|25_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|gsm8k|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hellaswag|10_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T18-44-39.604520.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T18-44-39.604520.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- '**/details_harness|winogrande|5_2023-12-09T18-44-39.604520.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T18-44-39.604520.parquet'
- config_name: results
data_files:
- split: 2023_12_09T18_44_39.604520
path:
- results_2023-12-09T18-44-39.604520.parquet
- split: latest
path:
- results_2023-12-09T18-44-39.604520.parquet
---
# Dataset Card for Evaluation run of amazingvince/where-llambo-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/amazingvince/where-llambo-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [amazingvince/where-llambo-7b](https://huggingface.co/amazingvince/where-llambo-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_amazingvince__where-llambo-7b",
"harness_winogrande_5",
	split="latest")
```
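The per-task config names follow a regular pattern derived from the harness task names listed in the header above, so they can be built programmatically. The sketch below uses a hypothetical helper, `to_config_name`, inferred from the config list; it is illustrative, not part of any official API:

```python
# Build the config names used by this dataset from harness task names.
# The mapping mirrors the configs listed in the YAML header: "|", ":" and
# "-" become "_", and the few-shot count is appended at the end.
def to_config_name(task: str, n_shot: int) -> str:
    base = task.replace(":", "_").replace("-", "_").replace("|", "_")
    return f"harness_{base}_{n_shot}"

print(to_config_name("arc:challenge", 25))          # harness_arc_challenge_25
print(to_config_name("winogrande", 5))              # harness_winogrande_5
print(to_config_name("hendrycksTest-virology", 5))  # harness_hendrycksTest_virology_5
```

The resulting string can be passed as the second argument to `load_dataset`, as in the example above.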
## Latest results
These are the [latest results from run 2023-12-09T18:44:39.604520](https://huggingface.co/datasets/open-llm-leaderboard/details_amazingvince__where-llambo-7b/blob/main/results_2023-12-09T18-44-39.604520.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its timestamped splits and in the "latest" split):
```json
{
"all": {
"acc": 0.6276007814719067,
"acc_stderr": 0.03245983620498288,
"acc_norm": 0.6287066769044074,
"acc_norm_stderr": 0.03312214889081226,
"mc1": 0.34394124847001223,
"mc1_stderr": 0.01662908751427678,
"mc2": 0.4961220088630948,
"mc2_stderr": 0.014820546287012869
},
"harness|arc:challenge|25": {
"acc": 0.5452218430034129,
"acc_stderr": 0.014551507060836357,
"acc_norm": 0.5844709897610921,
"acc_norm_stderr": 0.014401366641216386
},
"harness|hellaswag|10": {
"acc": 0.612427803226449,
"acc_stderr": 0.004862003566798543,
"acc_norm": 0.8205536745668194,
"acc_norm_stderr": 0.00382941380511398
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229862,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229862
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.02460362692409742,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.02460362692409742
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.01591955782997604,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.01591955782997604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876173,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876173
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500097,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500097
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27039106145251396,
"acc_stderr": 0.014854993938010076,
"acc_norm": 0.27039106145251396,
"acc_norm_stderr": 0.014854993938010076
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236844,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.02957326913441112,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.02957326913441112
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.019312676065786565,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.019312676065786565
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960238,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960238
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34394124847001223,
"mc1_stderr": 0.01662908751427678,
"mc2": 0.4961220088630948,
"mc2_stderr": 0.014820546287012869
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345402
},
"harness|gsm8k|5": {
"acc": 0.6520090978013646,
"acc_stderr": 0.013120581030382134
}
}
```
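As an illustrative aside (not part of the harness output), the per-subject `harness|hendrycksTest-*` entries above can be collapsed into a single macro-averaged MMLU-style score with a few lines of plain Python; the dict below hard-codes three subjects from the results for brevity:

```python
# Sketch: macro-average the per-subject MMLU ("hendrycksTest") accuracies
# from a results dict shaped like the JSON above (small excerpt hard-coded).
results = {
    "harness|arc:challenge|25": {"acc": 0.5452218430034129},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6296296296296297},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6710526315789473},
}

# Keep only the MMLU subtasks; other harness tasks (ARC, HellaSwag, ...) are skipped.
mmlu_accs = [
    v["acc"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro-average over {len(mmlu_accs)} subjects: {mmlu_avg:.4f}")
```

Treat this as a back-of-the-envelope check: the leaderboard's own "all" aggregate is computed by the harness and may combine tasks differently than this unweighted mean.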
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
autoevaluate/autoeval-eval-jeffdshen__redefine_math_test0-jeffdshen__redefine_math-58f952-1666158901 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- jeffdshen/redefine_math_test0
eval_info:
task: text_zero_shot_classification
model: facebook/opt-6.7b
metrics: []
dataset_name: jeffdshen/redefine_math_test0
dataset_config: jeffdshen--redefine_math_test0
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-6.7b
* Dataset: jeffdshen/redefine_math_test0
* Config: jeffdshen--redefine_math_test0
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
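For illustration only (the dict-based helper below is a sketch, not AutoTrain's actual implementation), the `col_mapping` in this card's header tells the evaluator which source columns to read as the canonical `text`, `classes`, and `target` fields:

```python
# Illustrative only: apply a col_mapping like the one in this card's YAML
# header to a single dataset record. Keys are the canonical field names the
# zero-shot evaluator expects; values are the source dataset's column names.
col_mapping = {"text": "prompt", "classes": "classes", "target": "answer_index"}

record = {
    "prompt": "1 + 1 =",
    "classes": ["2", "3"],
    "answer_index": 0,
}

# Each canonical field is read from the source column it maps to.
example = {canonical: record[source] for canonical, source in col_mapping.items()}
print(example)
# {'text': '1 + 1 =', 'classes': ['2', '3'], 'target': 0}
```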
## Contributions
Thanks to [@jeffdshen](https://huggingface.co/jeffdshen) for evaluating this model. |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_178 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1132099268.0
num_examples: 222329
download_size: 1156475830
dataset_size: 1132099268.0
---
# Dataset Card for "chunk_178"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shuyuej/prompt_consistency_training_full_data | ---
license: apache-2.0
---
# 🚀 Load Dataset
```python
from datasets import load_dataset
dataset = load_dataset("shuyuej/prompt_consistency_training_full_data")
dataset = dataset["train"]
print(dataset)
``` |
liuyanchen1015/MULTI_VALUE_cola_plural_postposed | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 22305
num_examples: 267
- name: test
num_bytes: 21748
num_examples: 261
- name: train
num_bytes: 164324
num_examples: 1993
download_size: 97410
dataset_size: 208377
---
# Dataset Card for "MULTI_VALUE_cola_plural_postposed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v10-7B | ---
pretty_name: Evaluation run of xzuyn/LLaMa-2-PeanutButter_v10-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xzuyn/LLaMa-2-PeanutButter_v10-7B](https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v10-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v10-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T09:43:30.219092](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v10-7B/blob/main/results_2023-08-31T09%3A43%3A30.219092.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4730030860907205,\n\
\ \"acc_stderr\": 0.0354163946301749,\n \"acc_norm\": 0.47702514467967894,\n\
\ \"acc_norm_stderr\": 0.035398499083378936,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184824,\n \"mc2\": 0.4378126637958177,\n\
\ \"mc2_stderr\": 0.015427252511292063\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5093856655290102,\n \"acc_stderr\": 0.014608816322065,\n\
\ \"acc_norm\": 0.552901023890785,\n \"acc_norm_stderr\": 0.014529380160526848\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6230830511850229,\n\
\ \"acc_stderr\": 0.004836234143655414,\n \"acc_norm\": 0.8168691495717985,\n\
\ \"acc_norm_stderr\": 0.0038598330442308963\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.039993097127774706,\n\
\ \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.039993097127774706\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4867924528301887,\n \"acc_stderr\": 0.030762134874500482,\n\
\ \"acc_norm\": 0.4867924528301887,\n \"acc_norm_stderr\": 0.030762134874500482\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.03257901482099834,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.03257901482099834\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.023973861998992062,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.023973861998992062\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523812,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523812\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5193548387096775,\n\
\ \"acc_stderr\": 0.02842268740431211,\n \"acc_norm\": 0.5193548387096775,\n\
\ \"acc_norm_stderr\": 0.02842268740431211\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n\
\ \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.03859268142070264,\n\
\ \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03859268142070264\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5303030303030303,\n \"acc_stderr\": 0.0355580405176393,\n \"acc_norm\"\
: 0.5303030303030303,\n \"acc_norm_stderr\": 0.0355580405176393\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6839378238341969,\n \"acc_stderr\": 0.033553973696861736,\n\
\ \"acc_norm\": 0.6839378238341969,\n \"acc_norm_stderr\": 0.033553973696861736\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.441025641025641,\n \"acc_stderr\": 0.02517404838400076,\n \
\ \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.02517404838400076\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.44537815126050423,\n \"acc_stderr\": 0.032284106267163895,\n\
\ \"acc_norm\": 0.44537815126050423,\n \"acc_norm_stderr\": 0.032284106267163895\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6275229357798165,\n \"acc_stderr\": 0.0207283684576385,\n \"acc_norm\"\
: 0.6275229357798165,\n \"acc_norm_stderr\": 0.0207283684576385\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.02988691054762697,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.02988691054762697\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.03471157907953427,\n\
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03471157907953427\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6118143459915611,\n \"acc_stderr\": 0.03172295004332328,\n \
\ \"acc_norm\": 0.6118143459915611,\n \"acc_norm_stderr\": 0.03172295004332328\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5291479820627802,\n\
\ \"acc_stderr\": 0.03350073248773404,\n \"acc_norm\": 0.5291479820627802,\n\
\ \"acc_norm_stderr\": 0.03350073248773404\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49079754601226994,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.49079754601226994,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5436893203883495,\n \"acc_stderr\": 0.04931801994220416,\n\
\ \"acc_norm\": 0.5436893203883495,\n \"acc_norm_stderr\": 0.04931801994220416\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6794871794871795,\n\
\ \"acc_stderr\": 0.030572811310299607,\n \"acc_norm\": 0.6794871794871795,\n\
\ \"acc_norm_stderr\": 0.030572811310299607\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6398467432950191,\n\
\ \"acc_stderr\": 0.0171663624713693,\n \"acc_norm\": 0.6398467432950191,\n\
\ \"acc_norm_stderr\": 0.0171663624713693\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4884393063583815,\n \"acc_stderr\": 0.026911898686377906,\n\
\ \"acc_norm\": 0.4884393063583815,\n \"acc_norm_stderr\": 0.026911898686377906\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2849162011173184,\n\
\ \"acc_stderr\": 0.015096222302469806,\n \"acc_norm\": 0.2849162011173184,\n\
\ \"acc_norm_stderr\": 0.015096222302469806\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.02862930519400354,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.02862930519400354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n\
\ \"acc_stderr\": 0.028071928247946205,\n \"acc_norm\": 0.5755627009646302,\n\
\ \"acc_norm_stderr\": 0.028071928247946205\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.027815973433878014,\n\
\ \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.027815973433878014\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34419817470664926,\n\
\ \"acc_stderr\": 0.012134433741002574,\n \"acc_norm\": 0.34419817470664926,\n\
\ \"acc_norm_stderr\": 0.012134433741002574\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4963235294117647,\n \"acc_stderr\": 0.0303720158854282,\n\
\ \"acc_norm\": 0.4963235294117647,\n \"acc_norm_stderr\": 0.0303720158854282\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4493464052287582,\n \"acc_stderr\": 0.02012376652802727,\n \
\ \"acc_norm\": 0.4493464052287582,\n \"acc_norm_stderr\": 0.02012376652802727\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5469387755102041,\n \"acc_stderr\": 0.03186785930004128,\n\
\ \"acc_norm\": 0.5469387755102041,\n \"acc_norm_stderr\": 0.03186785930004128\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6169154228855721,\n\
\ \"acc_stderr\": 0.0343751933733825,\n \"acc_norm\": 0.6169154228855721,\n\
\ \"acc_norm_stderr\": 0.0343751933733825\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079021,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079021\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184824,\n \"mc2\": 0.4378126637958177,\n\
\ \"mc2_stderr\": 0.015427252511292063\n }\n}\n```"
repo_url: https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v10-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|arc:challenge|25_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hellaswag|10_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T09:43:30.219092.parquet'
- config_name: results
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- results_2023-08-31T09:43:30.219092.parquet
- split: latest
path:
- results_2023-08-31T09:43:30.219092.parquet
---
# Dataset Card for Evaluation run of xzuyn/LLaMa-2-PeanutButter_v10-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v10-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xzuyn/LLaMa-2-PeanutButter_v10-7B](https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v10-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v10-7B",
"harness_truthfulqa_mc_0",
split="train")
```
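Once loaded, the per-task metrics in the "results" configuration can be aggregated in plain Python. This is a sketch under the assumption that the results are available as a dict keyed by task name, as in the JSON shown below; the small dict literal here is a hypothetical stand-in for that payload.

```python
# Sketch: average a metric across the MMLU (hendrycksTest) tasks in a
# results dict shaped like the JSON payload stored in this repo.
def mean_metric(results, metric="acc", prefix="harness|hendrycksTest-"):
    """Average `metric` over all entries whose task name starts with `prefix`."""
    values = [v[metric] for k, v in results.items() if k.startswith(prefix)]
    return sum(values) / len(values) if values else None

# Hypothetical subset of a results payload, for illustration only.
example = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6},
    "harness|truthfulqa:mc|0": {"mc1": 0.29},
}
print(mean_metric(example))  # averages only the hendrycksTest entries
```

The prefix filter mirrors the task-naming convention used throughout this card (`harness|<task>|<n-shot>`), so other benchmark families can be aggregated by changing `prefix`.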
## Latest results
These are the [latest results from run 2023-08-31T09:43:30.219092](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v10-7B/blob/main/results_2023-08-31T09%3A43%3A30.219092.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4730030860907205,
"acc_stderr": 0.0354163946301749,
"acc_norm": 0.47702514467967894,
"acc_norm_stderr": 0.035398499083378936,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184824,
"mc2": 0.4378126637958177,
"mc2_stderr": 0.015427252511292063
},
"harness|arc:challenge|25": {
"acc": 0.5093856655290102,
"acc_stderr": 0.014608816322065,
"acc_norm": 0.552901023890785,
"acc_norm_stderr": 0.014529380160526848
},
"harness|hellaswag|10": {
"acc": 0.6230830511850229,
"acc_stderr": 0.004836234143655414,
"acc_norm": 0.8168691495717985,
"acc_norm_stderr": 0.0038598330442308963
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.039993097127774706,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.039993097127774706
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4867924528301887,
"acc_stderr": 0.030762134874500482,
"acc_norm": 0.4867924528301887,
"acc_norm_stderr": 0.030762134874500482
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.03257901482099834,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.03257901482099834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.023973861998992062,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.023973861998992062
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523812,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523812
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5193548387096775,
"acc_stderr": 0.02842268740431211,
"acc_norm": 0.5193548387096775,
"acc_norm_stderr": 0.02842268740431211
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03859268142070264,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03859268142070264
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5303030303030303,
"acc_stderr": 0.0355580405176393,
"acc_norm": 0.5303030303030303,
"acc_norm_stderr": 0.0355580405176393
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6839378238341969,
"acc_stderr": 0.033553973696861736,
"acc_norm": 0.6839378238341969,
"acc_norm_stderr": 0.033553973696861736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.441025641025641,
"acc_stderr": 0.02517404838400076,
"acc_norm": 0.441025641025641,
"acc_norm_stderr": 0.02517404838400076
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.44537815126050423,
"acc_stderr": 0.032284106267163895,
"acc_norm": 0.44537815126050423,
"acc_norm_stderr": 0.032284106267163895
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6275229357798165,
"acc_stderr": 0.0207283684576385,
"acc_norm": 0.6275229357798165,
"acc_norm_stderr": 0.0207283684576385
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02988691054762697,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02988691054762697
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03471157907953427,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03471157907953427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6118143459915611,
"acc_stderr": 0.03172295004332328,
"acc_norm": 0.6118143459915611,
"acc_norm_stderr": 0.03172295004332328
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5291479820627802,
"acc_stderr": 0.03350073248773404,
"acc_norm": 0.5291479820627802,
"acc_norm_stderr": 0.03350073248773404
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068383,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068383
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49079754601226994,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.49079754601226994,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.5436893203883495,
"acc_stderr": 0.04931801994220416,
"acc_norm": 0.5436893203883495,
"acc_norm_stderr": 0.04931801994220416
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.030572811310299607,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.030572811310299607
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6398467432950191,
"acc_stderr": 0.0171663624713693,
"acc_norm": 0.6398467432950191,
"acc_norm_stderr": 0.0171663624713693
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4884393063583815,
"acc_stderr": 0.026911898686377906,
"acc_norm": 0.4884393063583815,
"acc_norm_stderr": 0.026911898686377906
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2849162011173184,
"acc_stderr": 0.015096222302469806,
"acc_norm": 0.2849162011173184,
"acc_norm_stderr": 0.015096222302469806
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946205,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946205
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251458,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34419817470664926,
"acc_stderr": 0.012134433741002574,
"acc_norm": 0.34419817470664926,
"acc_norm_stderr": 0.012134433741002574
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4963235294117647,
"acc_stderr": 0.0303720158854282,
"acc_norm": 0.4963235294117647,
"acc_norm_stderr": 0.0303720158854282
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4493464052287582,
"acc_stderr": 0.02012376652802727,
"acc_norm": 0.4493464052287582,
"acc_norm_stderr": 0.02012376652802727
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5469387755102041,
"acc_stderr": 0.03186785930004128,
"acc_norm": 0.5469387755102041,
"acc_norm_stderr": 0.03186785930004128
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6169154228855721,
"acc_stderr": 0.0343751933733825,
"acc_norm": 0.6169154228855721,
"acc_norm_stderr": 0.0343751933733825
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079021,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079021
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184824,
"mc2": 0.4378126637958177,
"mc2_stderr": 0.015427252511292063
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_mrpc_drop_copula_be_locative | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 6769
num_examples: 27
- name: train
num_bytes: 13539
num_examples: 56
- name: validation
num_bytes: 1619
num_examples: 7
download_size: 26500
dataset_size: 21927
---
# Dataset Card for "MULTI_VALUE_mrpc_drop_copula_be_locative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CS5647Team3/full_dataset | ---
task_categories:
- text-classification
language:
- zh
tags:
- tone
- pinyin
---
The full dataset is available on Kaggle:
Paddle Speech -> AISHELL-3 -> Train
https://www.kaggle.com/datasets/zenbot99/paddle-speech/ |
PhilEO-community/PhilEO-pretrain | ---
license: mit
---
# Dataset: PhilEO Pre-train
A novel 500GB Sentinel-2 dataset of the PhilEO Suite for model pre-training.
## Dataset Details
### Dataset Description
The PhilEO Pre-train dataset is a 500GB global dataset of Sentinel-2 images.
The data contain 11 bands at 10m resolution in the following order: 0-SCL, 1-B02, 2-B03, 3-B04, 4-B08, 5-B05, 6-B06, 7-B07, 8-B8A, 9-B11, and 10-B12 where SCL is the Scene Classification Layer.
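As an illustration of this band layout, here is a minimal sketch that indexes bands by name and extracts a true-color composite, assuming each patch is stored channel-first as an `(11, H, W)` array (the storage layout and the helper names are assumptions for illustration, not part of the release):

```python
import numpy as np

# Band order per the card: 0-SCL, 1-B02, 2-B03, 3-B04, 4-B08,
# 5-B05, 6-B06, 7-B07, 8-B8A, 9-B11, 10-B12.
BANDS = ["SCL", "B02", "B03", "B04", "B08",
         "B05", "B06", "B07", "B8A", "B11", "B12"]

def extract_rgb(patch):
    """Return an (H, W, 3) true-color array (B04, B03, B02) from an (11, H, W) patch."""
    r, g, b = (BANDS.index(name) for name in ("B04", "B03", "B02"))
    return np.stack([patch[r], patch[g], patch[b]], axis=-1)

# Toy example with a random patch standing in for Sentinel-2 reflectances:
patch = np.random.rand(11, 64, 64)
rgb = extract_rgb(patch)
print(rgb.shape)  # (64, 64, 3)
```

The SCL band at index 0 can be used in the same way to mask clouds or no-data pixels before pre-training.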
- **Curated by:** ESA Phi-lab and Leonardo Labs
- **License:** MIT
## Uses
The dataset can be used to pre-train models, i.e. to train Earth Observation (EO) Foundation Models.
### Dataset Sources
The basic links for the dataset:
- **Repository:** http://huggingface.co/datasets/ESA-philab/PhilEO-pretrain
## Citation
Casper Fibaek, Luke Camilleri, Andreas Luyts, Nikolaos Dionelis, Bertrand Le Saux, Bagaglini Leonardo, Cascarano Giacomo Donato, and Giorgio Pasquali, “The PhilEO Geospatial Foundation Model Suite,” To appear, 2024.
|
benjaminalgreen/hansenai_base_data_train | ---
dataset_info:
features:
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 42496418.7
num_examples: 225000
download_size: 26304037
dataset_size: 42496418.7
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberBrain_3_0 | ---
pretty_name: Evaluation run of LeroyDyer/Mixtral_AI_CyberBrain_3_0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LeroyDyer/Mixtral_AI_CyberBrain_3_0](https://huggingface.co/LeroyDyer/Mixtral_AI_CyberBrain_3_0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberBrain_3_0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-08T15:45:48.181807](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberBrain_3_0/blob/main/results_2024-04-08T15-45-48.181807.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6375164322801643,\n\
\ \"acc_stderr\": 0.03238876306844627,\n \"acc_norm\": 0.6418679785636611,\n\
\ \"acc_norm_stderr\": 0.03304065834797488,\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.47726689595676053,\n\
\ \"mc2_stderr\": 0.014968316380673696\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225403,\n\
\ \"acc_norm\": 0.6151877133105802,\n \"acc_norm_stderr\": 0.014218371065251102\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6512646883091018,\n\
\ \"acc_stderr\": 0.0047559605599291595,\n \"acc_norm\": 0.8424616610237005,\n\
\ \"acc_norm_stderr\": 0.0036356303524759065\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851112,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851112\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.024251071262208837,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.024251071262208837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6435897435897436,\n \"acc_stderr\": 0.02428314052946731,\n \
\ \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.02428314052946731\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399306,\n \"\
acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399306\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.033888571185023246,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.033888571185023246\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588674,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588674\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128138,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128138\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464076,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3005586592178771,\n\
\ \"acc_stderr\": 0.01533456680625116,\n \"acc_norm\": 0.3005586592178771,\n\
\ \"acc_norm_stderr\": 0.01533456680625116\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958143,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958143\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967294,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967294\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n\
\ \"acc_stderr\": 0.012715404841277736,\n \"acc_norm\": 0.45371577574967403,\n\
\ \"acc_norm_stderr\": 0.012715404841277736\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724556,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724556\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.47726689595676053,\n\
\ \"mc2_stderr\": 0.014968316380673696\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.011350315707462052\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.44200151630022744,\n \
\ \"acc_stderr\": 0.013679514492814586\n }\n}\n```"
repo_url: https://huggingface.co/LeroyDyer/Mixtral_AI_CyberBrain_3_0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|arc:challenge|25_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|gsm8k|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hellaswag|10_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T15-45-48.181807.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T15-45-48.181807.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- '**/details_harness|winogrande|5_2024-04-08T15-45-48.181807.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-08T15-45-48.181807.parquet'
- config_name: results
data_files:
- split: 2024_04_08T15_45_48.181807
path:
- results_2024-04-08T15-45-48.181807.parquet
- split: latest
path:
- results_2024-04-08T15-45-48.181807.parquet
---
# Dataset Card for Evaluation run of LeroyDyer/Mixtral_AI_CyberBrain_3_0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LeroyDyer/Mixtral_AI_CyberBrain_3_0](https://huggingface.co/LeroyDyer/Mixtral_AI_CyberBrain_3_0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberBrain_3_0",
"harness_winogrande_5",
	split="latest")
```
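Because the timestamped split names use a sortable `YYYY_MM_DDTHH_MM_SS.microseconds` format, the most recent run can also be resolved client-side without relying on the "latest" alias. A minimal sketch (the split names below are illustrative, taken from this card's single run):

```python
def latest_split(split_names):
    """Return the most recent timestamped split name.

    Split names follow the pattern YYYY_MM_DDTHH_MM_SS.microseconds,
    which sorts lexicographically in chronological order, so the most
    recent run is simply the maximum after dropping the "latest" alias.
    """
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)


splits = ["2024_04_08T15_45_48.181807", "latest"]
print(latest_split(splits))  # 2024_04_08T15_45_48.181807
```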
## Latest results
These are the [latest results from run 2024-04-08T15:45:48.181807](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberBrain_3_0/blob/main/results_2024-04-08T15-45-48.181807.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6375164322801643,
"acc_stderr": 0.03238876306844627,
"acc_norm": 0.6418679785636611,
"acc_norm_stderr": 0.03304065834797488,
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.47726689595676053,
"mc2_stderr": 0.014968316380673696
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225403,
"acc_norm": 0.6151877133105802,
"acc_norm_stderr": 0.014218371065251102
},
"harness|hellaswag|10": {
"acc": 0.6512646883091018,
"acc_stderr": 0.0047559605599291595,
"acc_norm": 0.8424616610237005,
"acc_norm_stderr": 0.0036356303524759065
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851112,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851112
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.02428314052946731,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.02428314052946731
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399306,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399306
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.033888571185023246,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.033888571185023246
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588674,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588674
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128138,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128138
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464076,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464076
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3005586592178771,
"acc_stderr": 0.01533456680625116,
"acc_norm": 0.3005586592178771,
"acc_norm_stderr": 0.01533456680625116
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958143,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967294,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967294
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45371577574967403,
"acc_stderr": 0.012715404841277736,
"acc_norm": 0.45371577574967403,
"acc_norm_stderr": 0.012715404841277736
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724556,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724556
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.47726689595676053,
"mc2_stderr": 0.014968316380673696
},
"harness|winogrande|5": {
"acc": 0.7947908445146015,
"acc_stderr": 0.011350315707462052
},
"harness|gsm8k|5": {
"acc": 0.44200151630022744,
"acc_stderr": 0.013679514492814586
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Loug/embeddings | ---
license: creativeml-openrail-m
---
|
irds/clinicaltrials_2021 | ---
pretty_name: '`clinicaltrials/2021`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `clinicaltrials/2021`
The `clinicaltrials/2021` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/clinicaltrials#clinicaltrials/2021).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=375,580
This dataset is used by: [`clinicaltrials_2021_trec-ct-2021`](https://huggingface.co/datasets/irds/clinicaltrials_2021_trec-ct-2021), [`clinicaltrials_2021_trec-ct-2022`](https://huggingface.co/datasets/irds/clinicaltrials_2021_trec-ct-2022)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/clinicaltrials_2021', 'docs')
for record in docs:
record # {'doc_id': ..., 'title': ..., 'condition': ..., 'summary': ..., 'detailed_description': ..., 'eligibility': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
open-llm-leaderboard/details_llmixer__BigWeave-v12-90b | ---
pretty_name: Evaluation run of llmixer/BigWeave-v12-90b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llmixer/BigWeave-v12-90b](https://huggingface.co/llmixer/BigWeave-v12-90b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llmixer__BigWeave-v12-90b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T04:50:33.456486](https://huggingface.co/datasets/open-llm-leaderboard/details_llmixer__BigWeave-v12-90b/blob/main/results_2024-02-10T04-50-33.456486.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6915101661412839,\n\
\ \"acc_stderr\": 0.03080691396047242,\n \"acc_norm\": 0.6970185048770328,\n\
\ \"acc_norm_stderr\": 0.031402488490329186,\n \"mc1\": 0.4112607099143207,\n\
\ \"mc1_stderr\": 0.01722562708366086,\n \"mc2\": 0.6135320199051351,\n\
\ \"mc2_stderr\": 0.014869013157104283\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6399317406143344,\n \"acc_stderr\": 0.014027516814585188,\n\
\ \"acc_norm\": 0.6808873720136519,\n \"acc_norm_stderr\": 0.013621696119173304\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6900019916351324,\n\
\ \"acc_stderr\": 0.004615472210316039,\n \"acc_norm\": 0.8770165305715992,\n\
\ \"acc_norm_stderr\": 0.0032774703870227274\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.030643607071677088,\n\
\ \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.030643607071677088\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.0271342916287417,\n\
\ \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.0271342916287417\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.676595744680851,\n \"acc_stderr\": 0.030579442773610337,\n\
\ \"acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.030579442773610337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.025680564640056882,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.025680564640056882\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n\
\ \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n\
\ \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5665024630541872,\n \"acc_stderr\": 0.034867317274198714,\n\
\ \"acc_norm\": 0.5665024630541872,\n \"acc_norm_stderr\": 0.034867317274198714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781685,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781685\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942084,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942084\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6846153846153846,\n \"acc_stderr\": 0.023559646983189946,\n\
\ \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.023559646983189946\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131137,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131137\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7310924369747899,\n \"acc_stderr\": 0.028801392193631276,\n\
\ \"acc_norm\": 0.7310924369747899,\n \"acc_norm_stderr\": 0.028801392193631276\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8899082568807339,\n \"acc_stderr\": 0.013419939018681203,\n \"\
acc_norm\": 0.8899082568807339,\n \"acc_norm_stderr\": 0.013419939018681203\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8970588235294118,\n \"acc_stderr\": 0.021328337570804365,\n \"\
acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.021328337570804365\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8945147679324894,\n \"acc_stderr\": 0.01999556072375854,\n \
\ \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.01999556072375854\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.0277901770643836,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.0277901770643836\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.0349814938546247,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.0349814938546247\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5892857142857143,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.5892857142857143,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8544061302681992,\n\
\ \"acc_stderr\": 0.012612475800423456,\n \"acc_norm\": 0.8544061302681992,\n\
\ \"acc_norm_stderr\": 0.012612475800423456\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7687861271676301,\n \"acc_stderr\": 0.022698657167855713,\n\
\ \"acc_norm\": 0.7687861271676301,\n \"acc_norm_stderr\": 0.022698657167855713\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6312849162011173,\n\
\ \"acc_stderr\": 0.016135759015030122,\n \"acc_norm\": 0.6312849162011173,\n\
\ \"acc_norm_stderr\": 0.016135759015030122\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.808641975308642,\n \"acc_stderr\": 0.021887704613396154,\n\
\ \"acc_norm\": 0.808641975308642,\n \"acc_norm_stderr\": 0.021887704613396154\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5710560625814863,\n\
\ \"acc_stderr\": 0.012640625443067368,\n \"acc_norm\": 0.5710560625814863,\n\
\ \"acc_norm_stderr\": 0.012640625443067368\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7401960784313726,\n \"acc_stderr\": 0.017740899509177795,\n \
\ \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.017740899509177795\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.043502714429232425,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.043502714429232425\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7714285714285715,\n \"acc_stderr\": 0.02688214492230774,\n\
\ \"acc_norm\": 0.7714285714285715,\n \"acc_norm_stderr\": 0.02688214492230774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4112607099143207,\n\
\ \"mc1_stderr\": 0.01722562708366086,\n \"mc2\": 0.6135320199051351,\n\
\ \"mc2_stderr\": 0.014869013157104283\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435091\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.47384382107657314,\n \
\ \"acc_stderr\": 0.013753627037255044\n }\n}\n```"
repo_url: https://huggingface.co/llmixer/BigWeave-v12-90b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|arc:challenge|25_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|gsm8k|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hellaswag|10_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T04-50-33.456486.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T04-50-33.456486.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- '**/details_harness|winogrande|5_2024-02-10T04-50-33.456486.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T04-50-33.456486.parquet'
- config_name: results
data_files:
- split: 2024_02_10T04_50_33.456486
path:
- results_2024-02-10T04-50-33.456486.parquet
- split: latest
path:
- results_2024-02-10T04-50-33.456486.parquet
---
# Dataset Card for Evaluation run of llmixer/BigWeave-v12-90b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [llmixer/BigWeave-v12-90b](https://huggingface.co/llmixer/BigWeave-v12-90b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llmixer__BigWeave-v12-90b",
"harness_winogrande_5",
	split="latest")
```
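Once loaded, each row of the details dataset (or the aggregated "results" JSON shown below) maps task names like `harness|hendrycksTest-anatomy|5` to metric dictionaries. As a minimal sketch (the `per_task_accuracy` helper and `example` dict are illustrative, not part of the dataset), per-task accuracies can be pulled out of such a dictionary like this:

```python
# Sketch: extract per-task "acc" values from a results dict shaped like the
# aggregated JSON in this card (keys such as "harness|arc:challenge|25").
def per_task_accuracy(results: dict) -> dict:
    """Map each harness task name to its 'acc' value.

    Skips the "all" aggregate and tasks that report other metrics
    (e.g. truthfulqa, which reports mc1/mc2 instead of acc).
    """
    return {
        task: metrics["acc"]
        for task, metrics in results.items()
        if task != "all" and "acc" in metrics
    }


# Small excerpt mirroring the structure of the results JSON below.
example = {
    "all": {"acc": 0.6915101661412839},
    "harness|arc:challenge|25": {"acc": 0.6399317406143344},
    "harness|truthfulqa:mc|0": {"mc1": 0.4112607099143207},
}
print(per_task_accuracy(example))
# {'harness|arc:challenge|25': 0.6399317406143344}
```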
## Latest results
These are the [latest results from run 2024-02-10T04:50:33.456486](https://huggingface.co/datasets/open-llm-leaderboard/details_llmixer__BigWeave-v12-90b/blob/main/results_2024-02-10T04-50-33.456486.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its "latest" split):
```json
{
"all": {
"acc": 0.6915101661412839,
"acc_stderr": 0.03080691396047242,
"acc_norm": 0.6970185048770328,
"acc_norm_stderr": 0.031402488490329186,
"mc1": 0.4112607099143207,
"mc1_stderr": 0.01722562708366086,
"mc2": 0.6135320199051351,
"mc2_stderr": 0.014869013157104283
},
"harness|arc:challenge|25": {
"acc": 0.6399317406143344,
"acc_stderr": 0.014027516814585188,
"acc_norm": 0.6808873720136519,
"acc_norm_stderr": 0.013621696119173304
},
"harness|hellaswag|10": {
"acc": 0.6900019916351324,
"acc_stderr": 0.004615472210316039,
"acc_norm": 0.8770165305715992,
"acc_norm_stderr": 0.0032774703870227274
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.030643607071677088,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.030643607071677088
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.0271342916287417,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.0271342916287417
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.676595744680851,
"acc_stderr": 0.030579442773610337,
"acc_norm": 0.676595744680851,
"acc_norm_stderr": 0.030579442773610337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.025680564640056882,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.025680564640056882
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5665024630541872,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.5665024630541872,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781685,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781685
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.023253157951942084,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.023253157951942084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6846153846153846,
"acc_stderr": 0.023559646983189946,
"acc_norm": 0.6846153846153846,
"acc_norm_stderr": 0.023559646983189946
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131137,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131137
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7310924369747899,
"acc_stderr": 0.028801392193631276,
"acc_norm": 0.7310924369747899,
"acc_norm_stderr": 0.028801392193631276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8899082568807339,
"acc_stderr": 0.013419939018681203,
"acc_norm": 0.8899082568807339,
"acc_norm_stderr": 0.013419939018681203
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.021328337570804365,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.021328337570804365
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.01999556072375854,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.01999556072375854
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.0277901770643836,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.0277901770643836
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.0349814938546247,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.0349814938546247
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5892857142857143,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.5892857142857143,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8544061302681992,
"acc_stderr": 0.012612475800423456,
"acc_norm": 0.8544061302681992,
"acc_norm_stderr": 0.012612475800423456
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7687861271676301,
"acc_stderr": 0.022698657167855713,
"acc_norm": 0.7687861271676301,
"acc_norm_stderr": 0.022698657167855713
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6312849162011173,
"acc_stderr": 0.016135759015030122,
"acc_norm": 0.6312849162011173,
"acc_norm_stderr": 0.016135759015030122
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.808641975308642,
"acc_stderr": 0.021887704613396154,
"acc_norm": 0.808641975308642,
"acc_norm_stderr": 0.021887704613396154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5710560625814863,
"acc_stderr": 0.012640625443067368,
"acc_norm": 0.5710560625814863,
"acc_norm_stderr": 0.012640625443067368
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.017740899509177795,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.017740899509177795
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.043502714429232425,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.043502714429232425
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7714285714285715,
"acc_stderr": 0.02688214492230774,
"acc_norm": 0.7714285714285715,
"acc_norm_stderr": 0.02688214492230774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4112607099143207,
"mc1_stderr": 0.01722562708366086,
"mc2": 0.6135320199051351,
"mc2_stderr": 0.014869013157104283
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435091
},
"harness|gsm8k|5": {
"acc": 0.47384382107657314,
"acc_stderr": 0.013753627037255044
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
daman1209arora/jeebench | ---
license: mit
task_categories:
- question-answering
language:
- en
tags:
- chemistry
- physics
- mathematics
pretty_name: jeebench
size_categories:
- n<1K
---
# JEEBench(EMNLP 2023)
Repository for the code and dataset for the paper "Have LLMs Advanced Enough? A Challenging Problem Solving Benchmark For Large Language Models", accepted as a main conference paper at EMNLP 2023.
https://aclanthology.org/2023.emnlp-main.468/
## Citation
If you use our dataset in your research, please cite it using the following BibTeX entry:
```bibtex
@inproceedings{arora-etal-2023-llms,
title = "Have {LLM}s Advanced Enough? A Challenging Problem Solving Benchmark For Large Language Models",
author = "Arora, Daman and
Singh, Himanshu and
{Mausam}",
editor = "Bouamor, Houda and
Pino, Juan and
Bali, Kalika",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.emnlp-main.468",
doi = "10.18653/v1/2023.emnlp-main.468",
pages = "7527--7543",
abstract = "The performance of large language models (LLMs) on existing reasoning benchmarks has significantly improved over the past years. In response, we present JEEBench, a considerably more challenging benchmark dataset for evaluating the problem solving abilities of LLMs. We curate 515 challenging pre-engineering mathematics, physics and chemistry problems from the highly competitive IIT JEE-Advanced exam. Long-horizon reasoning on top of deep in-domain knowledge is essential for solving problems in this benchmark. Our evaluation on various open-source and proprietary models reveals that the highest performance, even after using techniques like self-consistency, self-refinement and chain-of-thought prompting, is less than 40{\%}. The typical failure modes of GPT-4, the best model, are errors in algebraic manipulation, difficulty in grounding abstract concepts into mathematical equations accurately and failure in retrieving relevant domain-specific concepts. We also observe that by mere prompting, GPT-4 is unable to assess risk introduced by negative marking for incorrect answers. For this, we develop a post-hoc confidence-thresholding method over self-consistency, which enables effective response selection. We hope that our challenging benchmark will guide future re-search in problem-solving using LLMs.",
}
``` |
liuyanchen1015/MULTI_VALUE_rte_null_referential_pronouns | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 343545
num_examples: 715
- name: train
num_bytes: 300545
num_examples: 622
download_size: 423359
dataset_size: 644090
---
# Dataset Card for "MULTI_VALUE_rte_null_referential_pronouns"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/izumi_mei_theidolmstershinycolors | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of izumi_mei/和泉愛依 (THE iDOLM@STER: SHINY COLORS)
This is the dataset of izumi_mei/和泉愛依 (THE iDOLM@STER: SHINY COLORS), containing 500 images and their tags.
The core tags of this character are `blonde_hair, long_hair, brown_hair, multicolored_hair, breasts, gradient_hair, large_breasts, dark_skin, black_eyes, dark-skinned_female, earrings, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 784.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/izumi_mei_theidolmstershinycolors/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 421.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/izumi_mei_theidolmstershinycolors/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1240 | 918.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/izumi_mei_theidolmstershinycolors/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 680.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/izumi_mei_theidolmstershinycolors/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1240 | 1.32 GiB | [Download](https://huggingface.co/datasets/CyberHarem/izumi_mei_theidolmstershinycolors/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/izumi_mei_theidolmstershinycolors',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, gyaru, jewelry, looking_at_viewer, smile, solo, upper_body, blush, simple_background, tan, collarbone, black_shirt, choker, eyes_visible_through_hair, hair_between_eyes, waving |
| 1 | 12 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, smile, solo, crop_top, gyaru, midriff, navel, blush, hair_between_eyes, simple_background, collarbone, hoop_earrings, white_background, bare_shoulders, black_choker, tank_top, torn_jeans, upper_body |
| 2 | 5 |  |  |  |  |  | 1girl, blush, cleavage, collared_shirt, gyaru, looking_at_viewer, loose_bowtie, solo, white_shirt, jewelry, sleeves_rolled_up, smile, tan, upper_body, eyes_visible_through_hair, plaid, school_uniform, simple_background, blue_vest, collarbone, ear_piercing, hair_between_eyes, hair_over_one_eye, white_background |
| 3 | 7 |  |  |  |  |  | 1girl, bowtie, gyaru, looking_at_viewer, pleated_skirt, solo, sweater_vest, white_shirt, blush, collared_shirt, miniskirt, school_uniform, simple_background, cleavage, smile, white_background, grey_skirt, thighs, blue_vest, hair_between_eyes, jewelry, plaid, sleeves_rolled_up, tan |
| 4 | 8 |  |  |  |  |  | 1girl, fingerless_gloves, looking_at_viewer, solo, black_gloves, jewelry, navel, ponytail, cleavage, nail_polish, smile, bare_shoulders, collarbone, gyaru, midriff, purple_nails, white_jacket, blush, choker, pink_hair, thighhighs, garter_straps, miniskirt, off_shoulder, open_jacket, purple_hair, simple_background, streaked_hair, upper_body, white_background, white_bikini |
| 5 | 7 |  |  |  |  |  | 1girl, eyewear_on_head, gyaru, looking_at_viewer, solo, sunglasses, tan, white_bikini, bracelet, choker, cleavage, necklace, ponytail, armlet, blush, nail_polish, smile, bare_shoulders, beach, brown_eyes, day, outdoors, tongue_out |
| 6 | 14 |  |  |  |  |  | bracelet, brown_eyes, gyaru, looking_at_viewer, short_shorts, 1girl, solo, simple_background, white_background, bikini, belt, denim_shorts, navel_piercing, necklace, smile, nail_polish, hair_between_eyes, tongue_out, ass, open_mouth, tongue_piercing |
| 7 | 7 |  |  |  |  |  | 1girl, puffy_sleeves, solo, tan, frills, gyaru, maid_headdress, simple_background, looking_at_viewer, maid_apron, cleavage, jewelry, white_background, alternate_costume, hair_bun, long_sleeves, short_sleeves, wrist_cuffs |
| 8 | 7 |  |  |  |  |  | 1girl, blouse, gyaru, solo, tan, looking_at_viewer, blush, collared_shirt, jewelry, long_skirt, puffy_long_sleeves, vest, curtain_grab, eyes_visible_through_hair, smile, sunlight, window, hair_bow, indoors |
| 9 | 12 |  |  |  |  |  | 1girl, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, solo, blush, cleavage, bare_shoulders, detached_collar, smile, gyaru, simple_background, strapless_leotard, white_background, wrist_cuffs, bowtie, pantyhose, black_leotard, hair_between_eyes |
| 10 | 11 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, solo_focus, collarbone, sex, sweat, completely_nude, gyaru, hair_between_eyes, jewelry, navel, open_mouth, penis, spread_legs, vaginal, looking_at_viewer, pussy, cowgirl_position, female_pubic_hair, girl_on_top, tan, pov, bar_censor, cum, heart, smile |
| 11 | 5 |  |  |  |  |  | 1girl, ass, blush, looking_at_viewer, solo, bare_shoulders, hair_between_eyes, black_panties, brown_eyes, gyaru, simple_background, underwear_only, black_bra, collarbone, from_behind, grin, looking_back, nipples, tan, thighs, thong, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | gyaru | jewelry | looking_at_viewer | smile | solo | upper_body | blush | simple_background | tan | collarbone | black_shirt | choker | eyes_visible_through_hair | hair_between_eyes | waving | cleavage | crop_top | midriff | navel | hoop_earrings | white_background | bare_shoulders | black_choker | tank_top | torn_jeans | collared_shirt | loose_bowtie | white_shirt | sleeves_rolled_up | plaid | school_uniform | blue_vest | ear_piercing | hair_over_one_eye | bowtie | pleated_skirt | sweater_vest | miniskirt | grey_skirt | thighs | fingerless_gloves | black_gloves | ponytail | nail_polish | purple_nails | white_jacket | pink_hair | thighhighs | garter_straps | off_shoulder | open_jacket | purple_hair | streaked_hair | white_bikini | eyewear_on_head | sunglasses | bracelet | necklace | armlet | beach | brown_eyes | day | outdoors | tongue_out | short_shorts | bikini | belt | denim_shorts | navel_piercing | ass | open_mouth | tongue_piercing | puffy_sleeves | frills | maid_headdress | maid_apron | alternate_costume | hair_bun | long_sleeves | short_sleeves | wrist_cuffs | blouse | long_skirt | puffy_long_sleeves | vest | curtain_grab | sunlight | window | hair_bow | indoors | fake_animal_ears | playboy_bunny | rabbit_ears | detached_collar | strapless_leotard | pantyhose | black_leotard | 1boy | hetero | nipples | solo_focus | sex | sweat | completely_nude | penis | spread_legs | vaginal | pussy | cowgirl_position | female_pubic_hair | girl_on_top | pov | bar_censor | cum | heart | black_panties | underwear_only | black_bra | from_behind | grin | looking_back | thong |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:----------|:--------------------|:--------|:-------|:-------------|:--------|:--------------------|:------|:-------------|:--------------|:---------|:----------------------------|:--------------------|:---------|:-----------|:-----------|:----------|:--------|:----------------|:-------------------|:-----------------|:---------------|:-----------|:-------------|:-----------------|:---------------|:--------------|:--------------------|:--------|:-----------------|:------------|:---------------|:--------------------|:---------|:----------------|:---------------|:------------|:-------------|:---------|:--------------------|:---------------|:-----------|:--------------|:---------------|:---------------|:------------|:-------------|:----------------|:---------------|:--------------|:--------------|:----------------|:---------------|:------------------|:-------------|:-----------|:-----------|:---------|:--------|:-------------|:------|:-----------|:-------------|:---------------|:---------|:-------|:---------------|:-----------------|:------|:-------------|:------------------|:----------------|:---------|:-----------------|:-------------|:--------------------|:-----------|:---------------|:----------------|:--------------|:---------|:-------------|:---------------------|:-------|:---------------|:-----------|:---------|:-----------|:----------|:-------------------|:----------------|:--------------|:------------------|:--------------------|:------------|:----------------|:-------|:---------|:----------|:-------------|:------|:--------|:------------------|:--------|:--------------|:----------|:--------|:-------------------|:--------------------|:--------------|:------|:-------------|:------|:--------|:----------------|:-----------------|:------------|:--------------|:-------|:---------------|:--------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | | X | X | X | X | X | X | | X | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | X | X | | X | | | | | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | | | | | X | | X | | | | | X | | | | | X | | X | X | X | X | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | | X | | | | X | | X | X | | X | X | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | X | | X | X | X | | X | | X | | | X | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 14 |  |  |  |  |  | X | X | | X | X | X | | | X | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | X | X | X | | X | | | X | X | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | X | X | X | X | X | | X | | X | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 12 |  |  |  |  |  | X | X | | X | X | X | | X | X | | | | | | X | | X | | | | | X | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 11 |  |  |  |  |  | X | X | X | X | X | | | X | | X | X | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 11 | 5 |  |  |  |  |  | X | X | | X | | X | | X | X | X | X | | | | X | | | | | | | X | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
bigbitbus/chess | ---
license: apache-2.0
---
|
tensorpusher/botemsi-2.0 | ---
license: afl-3.0
task_categories:
- question-answering
- conversational
- text-generation
language:
- sr
pretty_name: botemsi
size_categories:
- n<1K
--- |
yukiamenta/dataseths | ---
license: apache-2.0
---
|
Ziyuan111/traffic_accident | ---
license: apache-2.0
task_categories:
- table-question-answering
language:
- en
size_categories:
- 10K<n<100K
---
# Comprehensive Traffic Collision Dataset Proposal for Montgomery County, MD
## zm83
### Introduction
Montgomery County, Maryland, has long been at the forefront of promoting the safety of roadway users, with an emphasis on protecting vulnerable non-motorists such as pedestrians and cyclists. In pursuit of this objective, the county has been actively collecting and publicly sharing detailed data on traffic collisions. Among the significant contributions to these efforts is the "Crash Reporting - Non-Motorists Data," a dataset specifically focused on incidents involving non-motorists.
This dataset is derived from the Automated Crash Reporting System (ACRS), overseen by the Maryland State Police. It is bolstered by the inclusion of reports from local law enforcement agencies, namely the Montgomery County Police, Gaithersburg Police, Rockville Police, and the Maryland-National Capital Park Police. These reports compile a detailed picture of each collision involving non-motorists and the specific conditions and contexts of these events.
It is imperative to recognize that the data within this dataset represents initial findings from preliminary reports to the Police Department by those directly involved in or witnesses to the collision. Consequently, the dataset includes:
- **Information Not Yet Verified:** Data entries that are awaiting further investigation for confirmation.
- **Mixed Verification Status:** A dataset that contains a mix of verified and unverified data regarding the collisions.
- **Preliminary Classifications:** Early assessments of collision events, which may be subject to alterations based on the outcomes of thorough investigations.
- **Potential Errors:** Instances of reporting that may possess mechanical inaccuracies or human errors, which are expected to be rectified once the verification process has been completed.
The commitment of Montgomery County to the safety of non-motorists is evident in the meticulous collection and dissemination of this data, reflecting a transparent and proactive approach to enhancing road safety for all.
### Executive Summary
Within Montgomery County, Maryland, a variety of datasets detailing traffic collisions are available, yet they exist as separate entities. Our proposal aims to integrate the following datasets into a single comprehensive traffic collision dataset:
- **Crash Reporting - Drivers Data**
- **Crash Reporting - Incidents Data:** [URL](https://data.montgomerycountymd.gov/Public-Safety/Crash-Reporting-Incidents-Data/bhju-22kf)
- **Crash Reporting - Non-Motorists Data:** [URL](https://data.montgomerycountymd.gov/Public-Safety/Crash-Reporting-Non-Motorists-Data/n7fk-dce5)
This integration will allow for a holistic examination of traffic collisions, combining driver information, incident specifics, and non-motorist data, to foster a deeper understanding and enhance traffic safety analysis within the county.
#### Datasets to be Integrated:
1. **Crash Reporting - Drivers Data:** This dataset contains detailed information about the drivers involved in traffic collisions, including demographics, driving behavior, and vehicle information.
2. **Crash Reporting - Incidents Data:** The incidents dataset provides a broader perspective on each collision, encompassing data points such as the time, location, and conditions under which the incident occurred.
3. **Crash Reporting - Non-Motorists Data:** Information regarding pedestrians, cyclists, and any other non-motorist parties involved in traffic collisions is captured in this dataset. It is crucial for understanding the risks and outcomes for these vulnerable road users.
By amalgamating these datasets, we will create a more robust and interconnected data resource that will empower stakeholders to:
- Gain a 360-degree view of the factors contributing to traffic collisions.
- Identify high-risk areas and demographics that may benefit from targeted interventions.
- Drive data-informed policy decisions aimed at enhancing road safety.
- Facilitate easier access to data for public use, fostering transparency and community engagement.
The integration process will involve the following steps:
1. **Data Acquisition:** Securely obtain the most recent and historical data from the provided URLs and any other relevant sources.
2. **Data Cleaning and Standardization:** Ensure consistency across datasets by standardizing data formats, resolving discrepancies, and cleaning any inaccuracies or incomplete records.
3. **Data Integration:** Utilize key identifiers (such as report numbers or dates) to merge datasets into a single, cohesive structure while maintaining data integrity.
4. **Quality Assurance:** Conduct thorough testing to ensure the reliability of the integrated dataset.
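The merge in step 3 can be sketched with pandas. The toy frames below are stand-ins for the three county datasets (the real files would be read with `pd.read_csv`), and the column names follow the card's attribute list, though the exact schemas here are assumptions for illustration:

```python
import pandas as pd

# Toy stand-ins for the three county datasets; column names follow the
# card's attribute list but the exact schemas are illustrative assumptions.
incidents = pd.DataFrame({
    "Report Number": ["R1", "R2", "R3"],
    "Crash Date/Time": ["2023-01-05 08:10", "2023-02-11 17:45", "2023-03-02 23:30"],
    "Weather": ["CLEAR", "RAINING", "CLEAR"],
})
drivers = pd.DataFrame({
    "Report Number": ["R1", "R1", "R2"],  # one row per driver involved
    "Driver Substance Abuse": ["NONE", "NONE", "ALCOHOL PRESENT"],
})
non_motorists = pd.DataFrame({
    "Report Number": ["R2"],
    "Pedestrian Type": ["PEDESTRIAN"],
})

# Merge on the shared key, keeping every incident even when no driver or
# non-motorist record matches (left joins preserve data integrity).
merged = (
    incidents
    .merge(drivers, on="Report Number", how="left")
    .merge(non_motorists, on="Report Number", how="left")
)
print(merged[["Report Number", "Weather", "Pedestrian Type"]])
```

Note that a left join duplicates an incident row for each matching driver (R1 appears twice above), which is the desired behavior for per-person analysis but should be deduplicated before counting incidents.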
Our approach promises to lay the groundwork for a data-driven strategy to reduce traffic collisions and enhance road safety in Montgomery County. We anticipate that this integrated dataset will not only serve immediate analytical needs but also establish a scalable framework for future data integration efforts.
### Analysis Goals
1. **Number of incidents over time:** This will plot a bar chart showing the number of incidents per year. This can help identify if there is an increasing or decreasing trend in traffic collisions.
2. **Correlation between weather conditions and number of accidents:** This will display a bar chart that shows the frequency of incidents under different weather conditions.
3. **Most dangerous roads:** This will identify the top 10 roads with the most incidents and display them in a bar chart.
4. **Demographic analysis:** This will involve pie charts or bar charts showing the distribution of incidents among different demographic groups, such as age and gender.
5. **Time analysis:** This will include line graphs or heat maps to demonstrate the times of day or days of the week when collisions are most frequent.
6. **Type of collision and non-motorist involvement:** This will show pie charts or bar charts that break down the types of collisions and the extent to which non-motorists are involved.
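Goal 1 (incidents over time) reduces to a group-and-count over the `Crash Date/Time` attribute. A minimal stdlib sketch, using synthetic timestamps in place of the real dataset values (the timestamp format shown is an assumption):

```python
from collections import Counter
from datetime import datetime

# Synthetic stand-ins for the Crash Date/Time attribute; real values would
# be parsed from the integrated dataset.
crash_times = [
    "01/05/2021 08:10:00 AM",
    "06/17/2021 05:45:00 PM",
    "03/02/2022 11:30:00 PM",
    "09/21/2022 07:15:00 AM",
    "09/22/2022 08:00:00 AM",
]

# Count incidents per year; the resulting mapping feeds directly into a
# bar chart (e.g. matplotlib's plt.bar).
years = [datetime.strptime(t, "%m/%d/%Y %I:%M:%S %p").year for t in crash_times]
incidents_per_year = Counter(years)
print(sorted(incidents_per_year.items()))  # -> [(2021, 2), (2022, 3)]
```

The same pattern covers goals 2 and 3 by counting over the `Weather` or `Road Name` fields instead of the year.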
### Conclusion
The proposed integration and analysis of Montgomery County's traffic collision datasets will provide a comprehensive understanding of the factors leading to traffic incidents and the impact they have on the community. By bringing together datasets that cover drivers, incidents, and non-motorists, we can achieve a multi-faceted view of road safety issues. This initiative will not only serve the immediate needs of traffic safety analysis but will also promote the development of more informed and effective traffic management strategies, potentially saving lives and reducing injuries on Montgomery County's roadways.
### Next Steps
1. **Stakeholder Engagement:** Collaborate with county officials, local law enforcement, and community organizations to align the project's objectives with public safety goals.
2. **Technical Development:** Assemble a team with expertise in data science and software engineering to handle the technical aspects of data integration and analysis.
3. **Public Outreach:** Develop a communication plan to inform the public about the initiative and the availability of the integrated dataset for community use.
By undertaking these steps, Montgomery County can continue to be a leader in using data to enhance road safety and protect its citizens.
# Dataset Card for Montgomery County Traffic Collisions
## Dataset Description
### General Information
- **Purpose**: This dataset is designed to provide comprehensive information on traffic collisions to facilitate analysis and policy-making for improved road safety in Montgomery County, Maryland.
- **Data Structure**: The dataset is structured with a collection of attributes that are critical for a detailed understanding of each collision event.
### Data Attributes
1. **Report Number**: A unique identifier for each collision report.
2. **Local Case Number**: Secondary identifier used by local agencies for tracking incidents.
3. **Agency Name**: The law enforcement agency that reported the collision.
4. **ACRS Report Type**: The type of report filed, categorizing the crash event.
5. **Crash Date/Time**: Timestamp of when the collision occurred.
6. **Route Type**: Classification of the road where the collision happened (e.g., Interstate, State Highway).
7. **Road Name**: Name of the road involved in the collision.
8. **Cross-Street Type**: Classification of the cross-street (if applicable).
9. **Cross-Street Name**: Name of the intersecting street.
10. **Off-Road Description**: Description for collisions that occurred off the main road.
11. **Municipality**: The city or town where the collision occurred.
12. **Related Non-Motorist**: Information on whether non-motorists were involved.
13. **Collision Type**: Describes the nature of the collision (e.g., head-on, rear-end).
14. **Weather**: Weather conditions at the time of the collision.
15. **Surface Condition**: Condition of the road surface (e.g., dry, wet, icy).
16. **Light**: Level of visibility based on lighting conditions.
17. **Traffic Control**: Indicates the presence and type of traffic control at the collision location.
18. **Driver Substance Abuse**: Information on whether substance abuse by the driver was a factor.
19. **Non-Motorist Substance Abuse**: Information on whether substance abuse by non-motorists was a factor.
20. **Person ID**: An identifier for individuals involved while maintaining privacy.
21. **Pedestrian Type**: Categorizes the type of non-motorist (e.g., pedestrian, cyclist).
22. **Pedestrian Movement**: Describes the movement of the pedestrian prior to the collision.
23. **Pedestrian Actions**: Specific actions the pedestrian was engaged in.
24. **Pedestrian Location**: Specifies where the pedestrian was located (e.g., crosswalk, sidewalk).
25. **Pedestrian Obeyed Traffic Signal**: Indicates if the pedestrian followed traffic signals.
26. **Pedestrian Visibility**: Notes on the visibility of the pedestrian.
27. **At Fault**: Notes on which party was at fault in the collision.
28. **Injury Severity**: Details the severity of any injuries incurred.
29. **Safety Equipment**: Information on safety equipment used (e.g., seat belts, helmets).
30. **Latitude and Longitude**: Geographic coordinates of the collision.
31. **Location**: A textual representation of the location, usually an address.
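Once the export is downloaded, these attributes map directly onto a tabular load. A minimal sketch using a two-row stand-in (the exact column header spellings in the published file are assumptions here):

```python
import io
import pandas as pd

# A two-row stand-in for the county export; the real file and exact
# header spellings may differ from this sketch.
csv_text = """Report Number,Crash Date/Time,Collision Type,Injury Severity,Latitude,Longitude
123456789,01/01/2024 17:45,Pedestrian,Fatal Injury,38.997564,-77.027755
123456790,01/02/2024 09:05,Rear-End,No Injury,39.001200,-77.031000
"""

df = pd.read_csv(io.StringIO(csv_text), parse_dates=["Crash Date/Time"])

# Typical first queries: a severity breakdown and a fatal-crash subset.
severity_counts = df["Injury Severity"].value_counts()
fatal = df[df["Injury Severity"] == "Fatal Injury"]

print(severity_counts)
print(fatal[["Report Number", "Latitude", "Longitude"]])
```

The same pattern extends to the remaining attributes (weather, light, traffic control, and so on) once the real column names are confirmed.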
### Dataset Example
```plaintext
Report Number: 123456789
Local Case Number: MCP123456
Agency Name: Montgomery County Police
ACRS Report Type: Fatal Crash
Crash Date/Time: 01/01/2024 17:45
Route Type: County Road
Road Name: Piney Branch Rd
Cross-Street Type: County Road
Cross-Street Name: Flower Ave
Off-Road Description: Near Intersection
Municipality: Silver Spring
Related Non-Motorist: Pedestrian
Collision Type: Pedestrian
Weather: Clear
Surface Condition: Dry
Light: Dusk
Traffic Control: Traffic Signal
Driver Substance Abuse: None Detected
Non-Motorist Substance Abuse: None Detected
Person ID: P1234567
Pedestrian Type: Adult
Pedestrian Movement: Crossing in Crosswalk
Pedestrian Actions: Walking
Pedestrian Location: In Crosswalk
Pedestrian Obeyed Traffic Signal: Yes
Pedestrian Visibility: High Visibility Clothing
At Fault: Driver
Injury Severity: Fatal Injury
Safety Equipment: None
Latitude and Longitude: 38.997564, -77.027755
Location: Piney Branch Rd & Flower Ave, Silver Spring, MD
``` |
CyberHarem/tosa_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tosa/土佐/土佐 (Azur Lane)
This is the dataset of tosa/土佐/土佐 (Azur Lane), containing 172 images and their tags.
The core tags of this character are `breasts, long_hair, animal_ears, large_breasts, mask_on_head, grey_hair, tail, sunglasses, fox_tail, eyewear_on_head, aviator_sunglasses, fox_ears, bangs, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 172 | 305.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tosa_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 172 | 149.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tosa_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 455 | 344.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tosa_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 172 | 256.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tosa_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 455 | 511.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tosa_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
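The 800/1200 and stage3 packages in the table are plain IMG+TXT archives: each image ships with a same-named `.txt` sidecar holding its comma-separated tags, so pairing them back up is just filename matching. A minimal sketch, with invented file names and tags standing in for an extracted archive:

```python
import tempfile
from pathlib import Path

# Build a stand-in for an extracted IMG+TXT package: every image file is
# accompanied by a same-named .txt file of comma-separated tags.
# (File names and tags here are invented for illustration.)
root = Path(tempfile.mkdtemp())
samples = {
    "0001.png": "1girl, fox_mask, solo",
    "0002.png": "1girl, black_choker, looking_at_viewer",
}
for name, tags in samples.items():
    (root / name).touch()                                  # empty image stub
    (root / name).with_suffix(".txt").write_text(tags)

# Pair each image with its parsed tag list.
pairs = {
    img.name: [t.strip() for t in img.with_suffix(".txt").read_text().split(",")]
    for img in sorted(root.glob("*.png"))
}

print(pairs)
```

The same loop works for the stage3 crops, which use the identical IMG+TXT naming convention; the archives themselves can be fetched with `hf_hub_download` as in the raw-dataset snippet below.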
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. To load it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tosa_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Results of tag clustering; some recurring outfits may be identifiable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, black_choker, fox_mask, looking_at_viewer, solo, two-tone_bikini, criss-cross_halter, day, highleg_bikini, navel, blue_sky, cloud, fluffy, outdoors, sitting, thigh_strap, cleavage, thighs, bare_shoulders, blush, braid, jewelry, ocean, very_long_hair |
| 1 | 22 |  |  |  |  |  | 1girl, black_choker, fox_mask, looking_at_viewer, solo, two-tone_bikini, highleg_bikini, criss-cross_halter, navel, fluffy, cleavage, white_background, simple_background, thigh_strap, blush |
| 2 | 7 |  |  |  |  |  | 1girl, blue_skirt, fox_mask, holding_sword, katana, solo, thigh_strap, wide_sleeves, bare_shoulders, black_gloves, detached_sleeves, looking_at_viewer, sideboob, antenna_hair, side_slit, standing, black_choker, sakuramon, sheath, sidelocks, simple_background, white_background, black_cape, closed_mouth, cowboy_shot, full_body, hakama, long_skirt, pleated_skirt, white_kimono |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_choker | fox_mask | looking_at_viewer | solo | two-tone_bikini | criss-cross_halter | day | highleg_bikini | navel | blue_sky | cloud | fluffy | outdoors | sitting | thigh_strap | cleavage | thighs | bare_shoulders | blush | braid | jewelry | ocean | very_long_hair | white_background | simple_background | blue_skirt | holding_sword | katana | wide_sleeves | black_gloves | detached_sleeves | sideboob | antenna_hair | side_slit | standing | sakuramon | sheath | sidelocks | black_cape | closed_mouth | cowboy_shot | full_body | hakama | long_skirt | pleated_skirt | white_kimono |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------|:--------------------|:-------|:------------------|:---------------------|:------|:-----------------|:--------|:-----------|:--------|:---------|:-----------|:----------|:--------------|:-----------|:---------|:-----------------|:--------|:--------|:----------|:--------|:-----------------|:-------------------|:--------------------|:-------------|:----------------|:---------|:---------------|:---------------|:-------------------|:-----------|:---------------|:------------|:-----------|:------------|:---------|:------------|:-------------|:---------------|:--------------|:------------|:---------|:-------------|:----------------|:---------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | | | X | | | X | X | | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | | | | | | | | | | | X | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
tyzhu/ds2_try_lora_merge | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 1044.247619047619
num_examples: 10
- name: validation
num_bytes: 1044.247619047619
num_examples: 10
download_size: 4650
dataset_size: 2088.495238095238
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "ds2_try_lora_merge"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AIMH-DHgroup/llama-2-7b-chat-events | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_RaoFoundation__774M-03_09_2024 | ---
pretty_name: Evaluation run of RaoFoundation/774M-03_09_2024
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RaoFoundation/774M-03_09_2024](https://huggingface.co/RaoFoundation/774M-03_09_2024)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RaoFoundation__774M-03_09_2024\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T07:31:11.420594](https://huggingface.co/datasets/open-llm-leaderboard/details_RaoFoundation__774M-03_09_2024/blob/main/results_2024-03-10T07-31-11.420594.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25714164108824067,\n\
\ \"acc_stderr\": 0.03086305887436439,\n \"acc_norm\": 0.2589960153068607,\n\
\ \"acc_norm_stderr\": 0.03165311681557453,\n \"mc1\": 0.21297429620563035,\n\
\ \"mc1_stderr\": 0.014332203787059686,\n \"mc2\": 0.3444347337952659,\n\
\ \"mc2_stderr\": 0.013606216674916146\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2790102389078498,\n \"acc_stderr\": 0.013106784883601345,\n\
\ \"acc_norm\": 0.302901023890785,\n \"acc_norm_stderr\": 0.013428241573185349\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.41366261700856405,\n\
\ \"acc_stderr\": 0.00491482938498347,\n \"acc_norm\": 0.5388368850826528,\n\
\ \"acc_norm_stderr\": 0.0049747064284342765\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2740740740740741,\n\
\ \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.2740740740740741,\n\
\ \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123394,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123394\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.026055296901152915,\n\
\ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.026055296901152915\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749912,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749912\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.029241883869628834,\n\
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.029241883869628834\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.33793103448275863,\n \"acc_stderr\": 0.039417076320648906,\n\
\ \"acc_norm\": 0.33793103448275863,\n \"acc_norm_stderr\": 0.039417076320648906\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.0339549002085611,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.0339549002085611\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.25483870967741934,\n \"acc_stderr\": 0.024790118459332204,\n \"\
acc_norm\": 0.25483870967741934,\n \"acc_norm_stderr\": 0.024790118459332204\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293752,\n \"\
acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293752\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752943,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752943\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.25384615384615383,\n \"acc_stderr\": 0.022066054378726257,\n\
\ \"acc_norm\": 0.25384615384615383,\n \"acc_norm_stderr\": 0.022066054378726257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.19327731092436976,\n \"acc_stderr\": 0.025649470265889183,\n\
\ \"acc_norm\": 0.19327731092436976,\n \"acc_norm_stderr\": 0.025649470265889183\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603826,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603826\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22568807339449543,\n \"acc_stderr\": 0.017923087667803053,\n \"\
acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.017923087667803053\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18518518518518517,\n \"acc_stderr\": 0.026491914727355168,\n \"\
acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.026491914727355168\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22058823529411764,\n \"acc_stderr\": 0.029102254389674082,\n \"\
acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.029102254389674082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3080168776371308,\n \"acc_stderr\": 0.030052389335605695,\n \
\ \"acc_norm\": 0.3080168776371308,\n \"acc_norm_stderr\": 0.030052389335605695\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n\
\ \"acc_stderr\": 0.03170882426845501,\n \"acc_norm\": 0.336322869955157,\n\
\ \"acc_norm_stderr\": 0.03170882426845501\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2809917355371901,\n \"acc_stderr\": 0.04103203830514512,\n \"\
acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.04103203830514512\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.03351953879521269,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.03351953879521269\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697625,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697625\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.02891120880274949,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.02891120880274949\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27458492975734355,\n\
\ \"acc_stderr\": 0.015959829933084046,\n \"acc_norm\": 0.27458492975734355,\n\
\ \"acc_norm_stderr\": 0.015959829933084046\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808842,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808842\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958167,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958167\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2604501607717042,\n\
\ \"acc_stderr\": 0.02492672322484553,\n \"acc_norm\": 0.2604501607717042,\n\
\ \"acc_norm_stderr\": 0.02492672322484553\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.02551873104953778,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.02551873104953778\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24837027379400262,\n\
\ \"acc_stderr\": 0.011035212598034517,\n \"acc_norm\": 0.24837027379400262,\n\
\ \"acc_norm_stderr\": 0.011035212598034517\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.34558823529411764,\n \"acc_stderr\": 0.02888819310398864,\n\
\ \"acc_norm\": 0.34558823529411764,\n \"acc_norm_stderr\": 0.02888819310398864\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24183006535947713,\n \"acc_stderr\": 0.017322789207784326,\n \
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.017322789207784326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n\
\ \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348398,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348398\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03377310252209196,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03377310252209196\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21297429620563035,\n\
\ \"mc1_stderr\": 0.014332203787059686,\n \"mc2\": 0.3444347337952659,\n\
\ \"mc2_stderr\": 0.013606216674916146\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5509076558800315,\n \"acc_stderr\": 0.01397945938914085\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \
\ \"acc_stderr\": 0.0015145735612245486\n }\n}\n```"
repo_url: https://huggingface.co/RaoFoundation/774M-03_09_2024
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|arc:challenge|25_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|arc:challenge|25_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|gsm8k|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|gsm8k|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hellaswag|10_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hellaswag|10_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|winogrande|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|winogrande|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T07-31-11.420594.parquet'
- config_name: results
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- results_2024-03-10T07-11-12.882374.parquet
- split: 2024_03_10T07_31_11.420594
path:
- results_2024-03-10T07-31-11.420594.parquet
- split: latest
path:
- results_2024-03-10T07-31-11.420594.parquet
---
# Dataset Card for Evaluation run of RaoFoundation/774M-03_09_2024
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RaoFoundation/774M-03_09_2024](https://huggingface.co/RaoFoundation/774M-03_09_2024) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RaoFoundation__774M-03_09_2024",
"harness_winogrande_5",
	split="latest")
```
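As the YAML metadata above shows, the per-run split names are simply the run timestamps with `-` and `:` replaced by `_`. A minimal helper (illustrative only, not part of any library) to build a split name from a timestamp:

```python
# Illustrative helper: per-run split names in this repo are the run
# timestamps with "-" and ":" replaced by "_".
def run_split_name(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

# e.g. run_split_name("2024-03-10T07:31:11.420594")
# -> "2024_03_10T07_31_11.420594"
```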
## Latest results
These are the [latest results from run 2024-03-10T07:31:11.420594](https://huggingface.co/datasets/open-llm-leaderboard/details_RaoFoundation__774M-03_09_2024/blob/main/results_2024-03-10T07-31-11.420594.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" and "latest" splits of each eval):
```python
{
"all": {
"acc": 0.25714164108824067,
"acc_stderr": 0.03086305887436439,
"acc_norm": 0.2589960153068607,
"acc_norm_stderr": 0.03165311681557453,
"mc1": 0.21297429620563035,
"mc1_stderr": 0.014332203787059686,
"mc2": 0.3444347337952659,
"mc2_stderr": 0.013606216674916146
},
"harness|arc:challenge|25": {
"acc": 0.2790102389078498,
"acc_stderr": 0.013106784883601345,
"acc_norm": 0.302901023890785,
"acc_norm_stderr": 0.013428241573185349
},
"harness|hellaswag|10": {
"acc": 0.41366261700856405,
"acc_stderr": 0.00491482938498347,
"acc_norm": 0.5388368850826528,
"acc_norm_stderr": 0.0049747064284342765
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123394,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123394
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.026055296901152915,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.026055296901152915
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749912,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749912
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.029241883869628834,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.029241883869628834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.33793103448275863,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.33793103448275863,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.0339549002085611,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.0339549002085611
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332204,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332204
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293752,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293752
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.2,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752943,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752943
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.25384615384615383,
"acc_stderr": 0.022066054378726257,
"acc_norm": 0.25384615384615383,
"acc_norm_stderr": 0.022066054378726257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19327731092436976,
"acc_stderr": 0.025649470265889183,
"acc_norm": 0.19327731092436976,
"acc_norm_stderr": 0.025649470265889183
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.036030385453603826,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.036030385453603826
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22568807339449543,
"acc_stderr": 0.017923087667803053,
"acc_norm": 0.22568807339449543,
"acc_norm_stderr": 0.017923087667803053
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.026491914727355168,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.026491914727355168
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3080168776371308,
"acc_stderr": 0.030052389335605695,
"acc_norm": 0.3080168776371308,
"acc_norm_stderr": 0.030052389335605695
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.03170882426845501,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.03170882426845501
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.03351953879521269,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.03351953879521269
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697625,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697625
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.02891120880274949,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.02891120880274949
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.27458492975734355,
"acc_stderr": 0.015959829933084046,
"acc_norm": 0.27458492975734355,
"acc_norm_stderr": 0.015959829933084046
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808842,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808842
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958167,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958167
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2604501607717042,
"acc_stderr": 0.02492672322484553,
"acc_norm": 0.2604501607717042,
"acc_norm_stderr": 0.02492672322484553
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.02551873104953778,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.02551873104953778
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24837027379400262,
"acc_stderr": 0.011035212598034517,
"acc_norm": 0.24837027379400262,
"acc_norm_stderr": 0.011035212598034517
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.34558823529411764,
"acc_stderr": 0.02888819310398864,
"acc_norm": 0.34558823529411764,
"acc_norm_stderr": 0.02888819310398864
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.017322789207784326,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.017322789207784326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348398,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348398
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03377310252209196,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03377310252209196
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21297429620563035,
"mc1_stderr": 0.014332203787059686,
"mc2": 0.3444347337952659,
"mc2_stderr": 0.013606216674916146
},
"harness|winogrande|5": {
"acc": 0.5509076558800315,
"acc_stderr": 0.01397945938914085
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245486
}
}
```
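The per-task blocks above can be sliced programmatically once loaded as a Python dict. A small sketch (values abridged from the JSON above; the `"all"` entry is skipped because it is an aggregate, not a task) that finds the strongest task by accuracy:

```python
# Abridged from the results JSON above; "all" holds aggregates,
# so it is excluded when ranking individual tasks.
results = {
    "all": {"acc": 0.25714164108824067},
    "harness|arc:challenge|25": {"acc": 0.2790102389078498},
    "harness|hellaswag|10": {"acc": 0.41366261700856405},
    "harness|hendrycksTest-virology|5": {"acc": 0.3192771084337349},
}

per_task = {k: v["acc"] for k, v in results.items() if k != "all"}
best_task = max(per_task, key=per_task.get)
# best_task -> "harness|hellaswag|10"
```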
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
logasja/mit-adobe-fivek | ---
dataset_info:
- config_name: a
features:
- name: original
dtype: image
- name: augmented
dtype: image
- name: location
dtype:
class_label:
names:
'0': outdoor
'1': indoor
'2': unknown
- name: time
dtype:
class_label:
names:
'0': day
'1': unknown
'2': dusk
'3': night
- name: light
dtype:
class_label:
names:
'0': sun_sky
'1': artificial
'2': unknown
'3': mixed
- name: subject
dtype:
class_label:
names:
'0': people
'1': man_made
'2': nature
'3': unknown
'4': animals
'5': abstract
- name: license
dtype:
class_label:
names:
'0': Adobe
'1': AdobeMIT
splits:
- name: train
num_bytes: 83516576303
num_examples: 3500
- name: test
num_bytes: 24332706376
num_examples: 1000
- name: validation
num_bytes: 11930052394
num_examples: 500
download_size: 119291008509
dataset_size: 119779335073
- config_name: b
features:
- name: original
dtype: image
- name: augmented
dtype: image
- name: location
dtype:
class_label:
names:
'0': outdoor
'1': indoor
'2': unknown
- name: time
dtype:
class_label:
names:
'0': day
'1': unknown
'2': dusk
'3': night
- name: light
dtype:
class_label:
names:
'0': sun_sky
'1': artificial
'2': unknown
'3': mixed
- name: subject
dtype:
class_label:
names:
'0': people
'1': man_made
'2': nature
'3': unknown
'4': animals
'5': abstract
- name: license
dtype:
class_label:
names:
'0': Adobe
'1': AdobeMIT
splits:
- name: train
num_bytes: 83258395373
num_examples: 3500
- name: test
num_bytes: 24212041008
num_examples: 1000
- name: validation
num_bytes: 11959397496
num_examples: 500
download_size: 118927071665
dataset_size: 119429833877
- config_name: c
features:
- name: original
dtype: image
- name: augmented
dtype: image
- name: location
dtype:
class_label:
names:
'0': outdoor
'1': indoor
'2': unknown
- name: time
dtype:
class_label:
names:
'0': day
'1': unknown
'2': dusk
'3': night
- name: light
dtype:
class_label:
names:
'0': sun_sky
'1': artificial
'2': unknown
'3': mixed
- name: subject
dtype:
class_label:
names:
'0': people
'1': man_made
'2': nature
'3': unknown
'4': animals
'5': abstract
- name: license
dtype:
class_label:
names:
'0': Adobe
'1': AdobeMIT
splits:
- name: train
num_bytes: 86634482129
num_examples: 3500
- name: test
num_bytes: 25274791938
num_examples: 1000
- name: validation
num_bytes: 12458944828
num_examples: 500
download_size: 123806916993
dataset_size: 124368218895
- config_name: d
features:
- name: original
dtype: image
- name: augmented
dtype: image
- name: location
dtype:
class_label:
names:
'0': outdoor
'1': indoor
'2': unknown
- name: time
dtype:
class_label:
names:
'0': day
'1': unknown
'2': dusk
'3': night
- name: light
dtype:
class_label:
names:
'0': sun_sky
'1': artificial
'2': unknown
'3': mixed
- name: subject
dtype:
class_label:
names:
'0': people
'1': man_made
'2': nature
'3': unknown
'4': animals
'5': abstract
- name: license
dtype:
class_label:
names:
'0': Adobe
'1': AdobeMIT
splits:
- name: train
num_bytes: 84743866913
num_examples: 3500
- name: test
num_bytes: 24642491298
num_examples: 1000
- name: validation
num_bytes: 12117343580
num_examples: 500
download_size: 120899071301
dataset_size: 121503701791
- config_name: e
features:
- name: original
dtype: image
- name: augmented
dtype: image
- name: location
dtype:
class_label:
names:
'0': outdoor
'1': indoor
'2': unknown
- name: time
dtype:
class_label:
names:
'0': day
'1': unknown
'2': dusk
'3': night
- name: light
dtype:
class_label:
names:
'0': sun_sky
'1': artificial
'2': unknown
'3': mixed
- name: subject
dtype:
class_label:
names:
'0': people
'1': man_made
'2': nature
'3': unknown
'4': animals
'5': abstract
- name: license
dtype:
class_label:
names:
'0': Adobe
'1': AdobeMIT
splits:
- name: train
num_bytes: 87195145386
num_examples: 3500
- name: test
num_bytes: 25341223232
num_examples: 1000
- name: validation
num_bytes: 12475902082
num_examples: 500
download_size: 124281756534
dataset_size: 125012270700
configs:
- config_name: a
data_files:
- split: train
path: a/train-*
- split: test
path: a/test-*
- split: validation
path: a/validation-*
- config_name: b
data_files:
- split: train
path: b/train-*
- split: test
path: b/test-*
- split: validation
path: b/validation-*
- config_name: c
data_files:
- split: train
path: c/train-*
- split: test
path: c/test-*
- split: validation
path: c/validation-*
- config_name: d
data_files:
- split: train
path: d/train-*
- split: test
path: d/test-*
- split: validation
path: d/validation-*
- config_name: e
data_files:
- split: train
path: e/train-*
- split: test
path: e/test-*
- split: validation
path: e/validation-*
task_categories:
- image-to-image
- feature-extraction
language:
- en
annotations_creators:
- expert-generated
license: other
license_name: adobe-mit
license_link: LICENSE.md
license_details: A custom license developed for this dataset by Adobe and MIT.
tags:
- adobe
- aesthetic
pretty_name: MIT Adobe FiveK
size_categories:
- 1K<n<10K
paperswithcode_id: mit-adobe-fivek
---
# Adobe FiveK
<!-- Provide a quick summary of the dataset. -->
This is an upload of the Adobe FiveK dataset.
Note that I am not one of the authors of this dataset; if one of the authors would like to take ownership of this repository, please reach out to me.
The data provided is not in the original format either.
Due to the massive size of the dataset (>1 TB), I elected to convert all .tif and .dng files to standard .webp files with lossless compression.
Please refer to the dataset homepage for access to the uncompressed versions of the data.
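The categorical columns (`location`, `time`, `light`, `subject`, `license`) are stored as integer class labels, which the `datasets` library decodes automatically via its `ClassLabel` feature. As a hand-rolled illustration of the mapping declared in the YAML metadata above (the helper below is not part of any library):

```python
# Mirrors the class_label names for the "subject" column declared in
# the dataset metadata; the datasets library performs this decoding
# automatically when you access the feature.
SUBJECT_NAMES = ["people", "man_made", "nature", "unknown", "animals", "abstract"]

def decode_subject(label: int) -> str:
    return SUBJECT_NAMES[label]

# decode_subject(2) -> "nature"
```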
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
We collected 5,000 photographs taken with SLR cameras by a set of different photographers.
They are all in RAW format; that is, all the information recorded by the camera sensor is preserved.
We made sure that these photographs cover a broad range of scenes, subjects, and lighting conditions.
We then hired five photography students in an art school to adjust the tone of the photos.
Each of them retouched all 5,000 photos using software dedicated to photo adjustment (Adobe Lightroom) on which they were extensively trained.
We asked the retouchers to achieve visually pleasing renditions, akin to a postcard. The retouchers were compensated for their work.
This dataset was collected for our project on learning photographic adjustments.
- **Acknowledgements:**
We are grateful to Katrin Eismann and Jeff Schewe for providing invaluable advice and for introducing us to the community of professional photographers.
We thank Todd Carroll, David Mager, Jaime Permuth, LaNola Katheleen Stone, and Damian Wampler for their incredible patience while retouching thousands of photos.
Special thanks to everyone who contributed their photos to this dataset: without you this work would not have been possible.
- **Funded by:** Foxconn and NSF (0964004) and a gift from Adobe
- **License:**
You can use these photos for research under the terms of the following licenses:
1. License [LicenseAdobe.txt](https://data.csail.mit.edu/graphics/fivek/legal/LicenseAdobe.txt) covers files listed in [filesAdobe.txt](https://data.csail.mit.edu/graphics/fivek/legal/filesAdobe.txt).
2. License [LicenseAdobeMIT.txt](https://data.csail.mit.edu/graphics/fivek/legal/LicenseAdobeMIT.txt) covers files listed in [filesAdobeMIT.txt](https://data.csail.mit.edu/graphics/fivek/legal/filesAdobeMIT.txt).
Each photo is labeled with the license it is under.
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://data.csail.mit.edu/graphics/fivek/
- **Paper:** http://people.csail.mit.edu/vladb/photoadjust/
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
@inproceedings{fivek,
  author = "Vladimir Bychkovsky and Sylvain Paris and Eric Chan and Fr{\'e}do Durand",
  title = "Learning Photographic Global Tonal Adjustment with a Database of Input / Output Image Pairs",
  booktitle = "The Twenty-Fourth IEEE Conference on Computer Vision and Pattern Recognition",
  year = "2011"
}
## Dataset Card Authors [optional]
@logasja
## Dataset Card Contact
@logasja |
ronitHF/pubmed-10k | ---
task_categories:
- summarization
pretty_name: PubMed 10k
size_categories:
- 1K<n<10K
---
### Dataset Summary
The first 10k rows of the `scientific_papers["pubmed"]` dataset, split 10:1:1 into train, validation, and test.
### Usage
```python
from datasets import load_dataset
train_dataset = load_dataset("ronitHF/pubmed-10k", split="train")
val_dataset = load_dataset("ronitHF/pubmed-10k", split="validation")
test_dataset = load_dataset("ronitHF/pubmed-10k", split="test")
```
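The 10:1:1 proportions can be reproduced on any list of row indices; a minimal sketch in pure Python (independent of the `datasets` library; the function name is illustrative):

```python
def split_10_1_1(n_rows):
    """Partition row indices into train/validation/test at 10:1:1 proportions."""
    indices = list(range(n_rows))
    train_end = n_rows * 10 // 12
    val_end = n_rows * 11 // 12
    return indices[:train_end], indices[train_end:val_end], indices[val_end:]

# 12000 rows split 10:1:1 -> 10000 / 1000 / 1000
train, val, test = split_10_1_1(12000)
```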
|
MaryamAlAli/Mixat_test | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 3792542644.068612
num_examples: 1587
download_size: 3216571999
dataset_size: 3792542644.068612
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Mixat_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibm/vira-dialog-acts-live | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 23507
num_examples: 571
- name: validation
num_bytes: 3165
num_examples: 71
- name: test
num_bytes: 2591
num_examples: 72
download_size: 0
dataset_size: 29263
---
# Dataset Card for "vira-dialog-acts-live"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
polinaeterna/test_push_two_configs | ---
dataset_info:
- config_name: v1
features:
- name: x
dtype: int64
- name: y
dtype: string
splits:
- name: train
num_bytes: 46
num_examples: 3
- name: test
num_bytes: 32
num_examples: 2
download_size: 1674
dataset_size: 78
- config_name: v2
features:
- name: x
dtype: int64
- name: y
dtype: string
splits:
- name: train
num_bytes: 60
num_examples: 4
- name: test
num_bytes: 18
num_examples: 1
download_size: 1671
dataset_size: 78
---
# Dataset Card for "test_push_two_configs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
freshpearYoon/train_free_7 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604928936
num_examples: 10000
download_size: 1750516119
dataset_size: 9604928936
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ThWu/filtered_nectar_2_openai_format | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: answers
list:
- name: answer
dtype: string
- name: model
dtype: string
- name: rank
dtype: float64
- name: turns
dtype: int64
- name: num_responses
dtype: int64
- name: source
sequence: string
- name: good_natured
dtype: bool
- name: openai_format_answers
list:
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 3035941399
num_examples: 182444
download_size: 1066151966
dataset_size: 3035941399
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mncai/ko-chatbot-arena | ---
license: apache-2.0
---
|
ZhangShenao/0.00045_idpo_noreplacerej_decalpha_ref_response | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: reference_response
dtype: string
splits:
- name: train_prefs_1
num_bytes: 164111773
num_examples: 20378
- name: test_prefs_1
num_bytes: 16019213
num_examples: 2000
download_size: 99390696
dataset_size: 180130986
configs:
- config_name: default
data_files:
- split: train_prefs_1
path: data/train_prefs_1-*
- split: test_prefs_1
path: data/test_prefs_1-*
---
# Dataset Card for "0.00045_idpo_noreplacerej_decalpha_ref_response"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hpi-dhc/evidence-inference-simple | ---
dataset_info:
features:
- name: pmcid
dtype: int32
- name: pmid
dtype: int32
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': no significant effect
'1': significant effect
splits:
- name: train
num_bytes: 1930106
num_examples: 1028
- name: validation
num_bytes: 229838
num_examples: 118
- name: test
num_bytes: 230635
num_examples: 123
download_size: 0
dataset_size: 2390579
---
# Dataset Card for "ei-abstract-significance"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fhasan85/bengali-prompts | ---
license: openrail
language:
- bn
---
# Dataset for evaluating language model using real world Bengali data
This dataset contains the prompts and questions people (mostly from Bangladesh) asked on [Alapchari](http://chatrik.org/alapchari) from 25 February 2023 to 4 June 2023.
It provides 35218 unique prompts for those interested in the development and evaluation of Bangla language models, offering a unique opportunity to evaluate their language models with real-world data collected from Bangladesh.
The prompts are currently sorted from the shortest strings to the longest; most of the good questions are in the middle. |
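Because the file is length-sorted, shuffling before taking any subset avoids a bias toward very short or very long prompts; a minimal sketch (the placeholder Bengali strings stand in for the real 35218 entries):

```python
import random

# Placeholder prompts standing in for the real entries; the dataset file
# itself is ordered from shortest string to longest.
prompts = ["প্রশ্ন", "একটি মাঝারি প্রশ্ন", "ব্যবহারকারীদের জিজ্ঞেস করা একটি দীর্ঘ বাস্তব প্রশ্নের উদাহরণ"]

# A fixed seed keeps any evaluation subset reproducible.
rng = random.Random(42)
shuffled = list(prompts)
rng.shuffle(shuffled)
```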
collabora/librilight-webdataset | ---
license: cc0-1.0
---
|
Myca/med_ | ---
license: cc-by-nc-3.0
---
|
minerba/orion_data | ---
license: apache-2.0
---
|
Markjr/minecraftGameplay | ---
license: cc-by-4.0
---
|
raicrits/Orca_ITA_200k | ---
license: other
---
# OpenOrca ITA 200k
Google Translate Italian translations of 200k random entries of the dataset [Open-Orca/OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca). All entries were selected randomly:
100k from those generated with gpt-3.5-turbo and the other 100k from those generated with gpt-4 (visible in the "gpt_version" column of this dataset). The ids are the ones present in the original dataset. |
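Since each row records which model produced it, the two 100k halves can be separated on the "gpt_version" column; a minimal sketch with toy rows standing in for the real entries (with the `datasets` library, the same predicate would go into a `filter` call):

```python
from collections import Counter

# Toy rows; the real dataset's "gpt_version" column takes the values
# "gpt-3.5-turbo" and "gpt-4" (the id values here are illustrative).
rows = [
    {"id": "niv.1", "gpt_version": "gpt-3.5-turbo"},
    {"id": "t0.7", "gpt_version": "gpt-4"},
    {"id": "cot.3", "gpt_version": "gpt-4"},
]

counts = Counter(r["gpt_version"] for r in rows)
gpt4_only = [r for r in rows if r["gpt_version"] == "gpt-4"]
```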
CyberHarem/gagaga_girl_yugioh | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of gagaga girl/ガガガガール (Yu-Gi-Oh! Zexal)
This is the dataset of gagaga girl/ガガガガール (Yu-Gi-Oh! Zexal), containing 152 images and their tags.
The core tags of this character are `blonde_hair, hat, wizard_hat, breasts, red_eyes, long_hair, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 152 | 178.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gagaga_girl_yugioh/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 152 | 109.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gagaga_girl_yugioh/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 319 | 214.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gagaga_girl_yugioh/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 152 | 161.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gagaga_girl_yugioh/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 319 | 298.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gagaga_girl_yugioh/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/gagaga_girl_yugioh',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
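Once loaded, the per-image tag lists can be aggregated into corpus-wide tag frequencies; a minimal sketch (the toy tag dictionaries stand in for `item.meta['tags']` of each image):

```python
from collections import Counter

# Toy tag mappings standing in for item.meta['tags'] of two images.
tag_lists = [
    {"1girl": 1, "solo": 1, "smile": 1},
    {"1girl": 1, "duel_monster": 1, "smile": 1},
]

freq = Counter()
for tags in tag_lists:
    freq.update(tags.keys())

most_common = freq.most_common(2)
```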
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, detached_sleeves, duel_monster, solo, bare_shoulders, skull, smile, boots, chain, cellphone_charm |
| 1 | 10 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, black_headwear, detached_sleeves, duel_monster, hair_between_eyes, solo, looking_at_viewer, blush, closed_mouth, necklace, medium_hair, upper_body, bangs, smile, sleeveless_dress, taut_clothes, black_sleeves, white_background |
| 2 | 9 |  |  |  |  |  | 1boy, 1girl, bare_shoulders, blush, duel_monster, hetero, huge_breasts, nipples, paizuri, solo_focus, cum_on_breasts, detached_sleeves, penis, bar_censor, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | detached_sleeves | duel_monster | solo | bare_shoulders | skull | smile | boots | chain | cellphone_charm | black_dress | black_headwear | hair_between_eyes | looking_at_viewer | blush | closed_mouth | necklace | medium_hair | upper_body | bangs | sleeveless_dress | taut_clothes | black_sleeves | white_background | 1boy | hetero | huge_breasts | nipples | paizuri | solo_focus | cum_on_breasts | penis | bar_censor |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:---------------|:-------|:-----------------|:--------|:--------|:--------|:--------|:------------------|:--------------|:-----------------|:--------------------|:--------------------|:--------|:---------------|:-----------|:--------------|:-------------|:--------|:-------------------|:---------------|:----------------|:-------------------|:-------|:---------|:---------------|:----------|:----------|:-------------|:-----------------|:--------|:-------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | | X | | X | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
XiaoY1/CMMMU | ---
dataset_info:
config_name: technology_and_engineering
features:
- name: id
dtype: string
- name: type
dtype: string
- name: source_type
dtype: string
- name: source
dtype: string
- name: question
dtype: string
- name: option1
dtype: string
- name: option2
dtype: string
- name: option3
dtype: string
- name: option4
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: answer
dtype: string
- name: analysis
dtype: string
- name: distribution
dtype: string
- name: difficulty_level
dtype: string
- name: subcategory
dtype: string
- name: category
dtype: string
- name: subfield
dtype: string
- name: img_type
dtype: string
- name: image_1_filename
dtype: string
- name: image_2_filename
dtype: string
- name: image_3_filename
dtype: string
- name: image_4_filename
dtype: string
- name: image_5_filename
dtype: string
splits:
- name: dev
num_bytes: 13180933.0
num_examples: 112
- name: val
num_bytes: 95827659.0
num_examples: 900
- name: test
num_bytes: 3146076690.0
num_examples: 11000
download_size: 1297432627
dataset_size: 3255085282.0
configs:
- config_name: technology_and_engineering
data_files:
- split: dev
path: technology_and_engineering/dev-*
- split: val
path: technology_and_engineering/val-*
- split: test
path: technology_and_engineering/test-*
---
|
open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2.5-Instruct | ---
pretty_name: Evaluation run of Locutusque/TinyMistral-248M-v2.5-Instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/TinyMistral-248M-v2.5-Instruct](https://huggingface.co/Locutusque/TinyMistral-248M-v2.5-Instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2.5-Instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-27T01:45:07.837106](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2.5-Instruct/blob/main/results_2024-01-27T01-45-07.837106.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23908148309733446,\n\
\ \"acc_stderr\": 0.030234054596903193,\n \"acc_norm\": 0.2393250264225143,\n\
\ \"acc_norm_stderr\": 0.031024873198164184,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662587,\n \"mc2\": 0.4420811324629599,\n\
\ \"mc2_stderr\": 0.015284325356180175\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.21331058020477817,\n \"acc_stderr\": 0.011970971742326334,\n\
\ \"acc_norm\": 0.2226962457337884,\n \"acc_norm_stderr\": 0.012158314774829931\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2669786895040829,\n\
\ \"acc_stderr\": 0.004414770331224643,\n \"acc_norm\": 0.27604062935670187,\n\
\ \"acc_norm_stderr\": 0.004461235175488311\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\"\
: 0.034554737023254366,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\"\
: 0.034554737023254366\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \
\ \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.033911609343436025,\n\
\ \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.033911609343436025\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24528301886792453,\n \"acc_stderr\": 0.026480357179895702,\n\
\ \"acc_norm\": 0.24528301886792453,\n \"acc_norm_stderr\": 0.026480357179895702\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.039505818611799616,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.039505818611799616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.029644006577009618,\n\
\ \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.029644006577009618\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281337,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281337\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633356,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633356\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.03619604524124251,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.03619604524124251\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.21935483870967742,\n\
\ \"acc_stderr\": 0.023540799358723278,\n \"acc_norm\": 0.21935483870967742,\n\
\ \"acc_norm_stderr\": 0.023540799358723278\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03255086769970103,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03255086769970103\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20207253886010362,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.20207253886010362,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.022489389793654824,\n\
\ \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.022489389793654824\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945284,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945284\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25630252100840334,\n \"acc_stderr\": 0.02835962087053395,\n\
\ \"acc_norm\": 0.25630252100840334,\n \"acc_norm_stderr\": 0.02835962087053395\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.20917431192660552,\n \"acc_stderr\": 0.017437937173343226,\n \"\
acc_norm\": 0.20917431192660552,\n \"acc_norm_stderr\": 0.017437937173343226\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.029157522184605617,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.029157522184605617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658335,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658335\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.242152466367713,\n\
\ \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.242152466367713,\n\
\ \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794089,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794089\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.028120966503914418,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.028120966503914418\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2720306513409962,\n\
\ \"acc_stderr\": 0.015913367447500527,\n \"acc_norm\": 0.2720306513409962,\n\
\ \"acc_norm_stderr\": 0.015913367447500527\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.02344582627654555,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.02344582627654555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\
\ \"acc_stderr\": 0.014378169884098431,\n \"acc_norm\": 0.2446927374301676,\n\
\ \"acc_norm_stderr\": 0.014378169884098431\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.20915032679738563,\n \"acc_stderr\": 0.023287685312334806,\n\
\ \"acc_norm\": 0.20915032679738563,\n \"acc_norm_stderr\": 0.023287685312334806\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.20257234726688103,\n\
\ \"acc_stderr\": 0.02282731749105968,\n \"acc_norm\": 0.20257234726688103,\n\
\ \"acc_norm_stderr\": 0.02282731749105968\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.022779719088733396,\n\
\ \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.022779719088733396\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23468057366362452,\n\
\ \"acc_stderr\": 0.010824026872449322,\n \"acc_norm\": 0.23468057366362452,\n\
\ \"acc_norm_stderr\": 0.010824026872449322\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20955882352941177,\n \"acc_stderr\": 0.024723110407677055,\n\
\ \"acc_norm\": 0.20955882352941177,\n \"acc_norm_stderr\": 0.024723110407677055\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26633986928104575,\n \"acc_stderr\": 0.0178831881346672,\n \
\ \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.0178831881346672\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19183673469387755,\n \"acc_stderr\": 0.025206963154225374,\n\
\ \"acc_norm\": 0.19183673469387755,\n \"acc_norm_stderr\": 0.025206963154225374\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21393034825870647,\n\
\ \"acc_stderr\": 0.028996909693328934,\n \"acc_norm\": 0.21393034825870647,\n\
\ \"acc_norm_stderr\": 0.028996909693328934\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.18072289156626506,\n\
\ \"acc_stderr\": 0.029955737855810138,\n \"acc_norm\": 0.18072289156626506,\n\
\ \"acc_norm_stderr\": 0.029955737855810138\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03218093795602357,\n\
\ \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03218093795602357\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662587,\n \"mc2\": 0.4420811324629599,\n\
\ \"mc2_stderr\": 0.015284325356180175\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.48224151539068666,\n \"acc_stderr\": 0.014043619596174966\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/TinyMistral-248M-v2.5-Instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|arc:challenge|25_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|gsm8k|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hellaswag|10_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T01-45-07.837106.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T01-45-07.837106.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- '**/details_harness|winogrande|5_2024-01-27T01-45-07.837106.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-27T01-45-07.837106.parquet'
- config_name: results
data_files:
- split: 2024_01_27T01_45_07.837106
path:
- results_2024-01-27T01-45-07.837106.parquet
- split: latest
path:
- results_2024-01-27T01-45-07.837106.parquet
---
# Dataset Card for Evaluation run of Locutusque/TinyMistral-248M-v2.5-Instruct
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/TinyMistral-248M-v2.5-Instruct](https://huggingface.co/Locutusque/TinyMistral-248M-v2.5-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
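Because split names embed the run timestamp in the format `%Y_%m_%dT%H_%M_%S.%f`, their zero-padded fields sort lexicographically in chronological order. A minimal sketch (using hypothetical split names) of how the newest run can be resolved from a list of timestamp splits:

```python
# Hypothetical split names following the run-timestamp format used above.
splits = ["2024_01_20T10_00_00.000001", "2024_01_27T01_45_07.837106"]

# Zero-padded timestamp strings sort lexicographically in chronological
# order, so max() yields the name of the most recent run.
latest_run = max(splits)
print(latest_run)  # 2024_01_27T01_45_07.837106
```

In practice the pre-defined "latest" split already points at this run, so this is only needed when working with the raw timestamp splits directly.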
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2.5-Instruct",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-27T01:45:07.837106](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2.5-Instruct/blob/main/results_2024-01-27T01-45-07.837106.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.23908148309733446,
"acc_stderr": 0.030234054596903193,
"acc_norm": 0.2393250264225143,
"acc_norm_stderr": 0.031024873198164184,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662587,
"mc2": 0.4420811324629599,
"mc2_stderr": 0.015284325356180175
},
"harness|arc:challenge|25": {
"acc": 0.21331058020477817,
"acc_stderr": 0.011970971742326334,
"acc_norm": 0.2226962457337884,
"acc_norm_stderr": 0.012158314774829931
},
"harness|hellaswag|10": {
"acc": 0.2669786895040829,
"acc_stderr": 0.004414770331224643,
"acc_norm": 0.27604062935670187,
"acc_norm_stderr": 0.004461235175488311
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.034554737023254366,
"acc_norm": 0.2,
"acc_norm_stderr": 0.034554737023254366
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.033911609343436025,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.033911609343436025
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24528301886792453,
"acc_stderr": 0.026480357179895702,
"acc_norm": 0.24528301886792453,
"acc_norm_stderr": 0.026480357179895702
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.039505818611799616,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.039505818611799616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28936170212765955,
"acc_stderr": 0.029644006577009618,
"acc_norm": 0.28936170212765955,
"acc_norm_stderr": 0.029644006577009618
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281337,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281337
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633356,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633356
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.03619604524124251,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.03619604524124251
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.21935483870967742,
"acc_stderr": 0.023540799358723278,
"acc_norm": 0.21935483870967742,
"acc_norm_stderr": 0.023540799358723278
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03255086769970103,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03255086769970103
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20207253886010362,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.20207253886010362,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.022489389793654824,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.022489389793654824
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945284,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25630252100840334,
"acc_stderr": 0.02835962087053395,
"acc_norm": 0.25630252100840334,
"acc_norm_stderr": 0.02835962087053395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.20917431192660552,
"acc_stderr": 0.017437937173343226,
"acc_norm": 0.20917431192660552,
"acc_norm_stderr": 0.017437937173343226
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.029157522184605617,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.029157522184605617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658335,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.242152466367713,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.242152466367713,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794089,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794089
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.028120966503914418,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.028120966503914418
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2720306513409962,
"acc_stderr": 0.015913367447500527,
"acc_norm": 0.2720306513409962,
"acc_norm_stderr": 0.015913367447500527
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.02344582627654555,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.02344582627654555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098431,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098431
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.20915032679738563,
"acc_stderr": 0.023287685312334806,
"acc_norm": 0.20915032679738563,
"acc_norm_stderr": 0.023287685312334806
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.20257234726688103,
"acc_stderr": 0.02282731749105968,
"acc_norm": 0.20257234726688103,
"acc_norm_stderr": 0.02282731749105968
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.022779719088733396,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.022779719088733396
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23468057366362452,
"acc_stderr": 0.010824026872449322,
"acc_norm": 0.23468057366362452,
"acc_norm_stderr": 0.010824026872449322
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20955882352941177,
"acc_stderr": 0.024723110407677055,
"acc_norm": 0.20955882352941177,
"acc_norm_stderr": 0.024723110407677055
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.0178831881346672,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.0178831881346672
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.025206963154225374,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.025206963154225374
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.028996909693328934,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.028996909693328934
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.18072289156626506,
"acc_stderr": 0.029955737855810138,
"acc_norm": 0.18072289156626506,
"acc_norm_stderr": 0.029955737855810138
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03218093795602357,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03218093795602357
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662587,
"mc2": 0.4420811324629599,
"mc2_stderr": 0.015284325356180175
},
"harness|winogrande|5": {
"acc": 0.48224151539068666,
"acc_stderr": 0.014043619596174966
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
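For readers working with the raw per-task scores above, here is a minimal, illustrative sketch of macro-averaging a metric over results shaped like the JSON block. The `macro_average` helper is not part of the evaluation harness; the task names and values are copied from the results above.

```python
# Illustrative helper: unweighted mean of a metric over per-task results
# shaped like the JSON block above (task -> {"acc": ..., "acc_stderr": ...}).

def macro_average(results, metric="acc"):
    """Return the unweighted mean of `metric` across all tasks that report it."""
    values = [scores[metric] for scores in results.values() if metric in scores]
    return sum(values) / len(values)

# A few entries copied from the results block above:
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.2236842105263158},
}

print(round(macro_average(results), 4))  # → 0.2279
```

Note this is a plain macro-average (each task weighted equally), which matches how the leaderboard's aggregate MMLU score is usually reported, not a micro-average over individual questions.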
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Cheetor1996/Lilia_Milcrabe | ---
license: cc-by-2.0
language:
- en
tags:
- art
---
**Lilia Milcrabe** from **Viper F-40**
- *Trained with the anime (full-final-pruned) model.*
- *3 versions: 5, 8, and 10 epochs.*
- *Recommended LoRA weight blocks: MIDD, OUTD, and OUTALL. (ALL is a bit messy, but you can still use it at your own risk.)*
- *Works best with 0.7+ weights, but use 0.8-1.0 weights to get the character as accurate as possible, especially if using the OUTD and OUTALL LoRA weight blocks.*
- *Weighting the activation tag lilia milcrabe (preferably 1:1 or 1:2) is recommended if the character doesn't come out right at first.* |
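As a purely illustrative aid, the usage notes above can be turned into a small prompt-building helper. The `<lora:name:weight>` and `(tag:weight)` syntax assumed here is the Stable Diffusion WebUI convention, and the file name `lilia_milcrabe` is hypothetical, not confirmed by this card.

```python
# Hypothetical sketch of a WebUI-style prompt following the notes above:
# LoRA weight in the 0.8-1.0 range, activation tag given extra emphasis.

def build_prompt(lora_name, lora_weight, activation_tag, tag_weight, *extras):
    """Compose a prompt string using WebUI-style LoRA and emphasis syntax."""
    parts = [f"<lora:{lora_name}:{lora_weight}>", f"({activation_tag}:{tag_weight})"]
    parts.extend(extras)
    return ", ".join(parts)

prompt = build_prompt("lilia_milcrabe", 0.8, "lilia milcrabe", 1.1, "1girl", "solo")
print(prompt)
# → <lora:lilia_milcrabe:0.8>, (lilia milcrabe:1.1), 1girl, solo
```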
open-llm-leaderboard/details_adonlee__LLaMA_2_70B_LoRA | ---
pretty_name: Evaluation run of adonlee/LLaMA_2_70B_LoRA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [adonlee/LLaMA_2_70B_LoRA](https://huggingface.co/adonlee/LLaMA_2_70B_LoRA) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adonlee__LLaMA_2_70B_LoRA\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T21:35:51.410251](https://huggingface.co/datasets/open-llm-leaderboard/details_adonlee__LLaMA_2_70B_LoRA/blob/main/results_2023-09-22T21-35-51.410251.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7077096775676626,\n\
\ \"acc_stderr\": 0.030867670314758275,\n \"acc_norm\": 0.7114995822621553,\n\
\ \"acc_norm_stderr\": 0.030836833292351554,\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.6451679386365279,\n\
\ \"mc2_stderr\": 0.014753028795637621\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6902730375426621,\n \"acc_stderr\": 0.013512058415238361,\n\
\ \"acc_norm\": 0.726962457337884,\n \"acc_norm_stderr\": 0.013019332762635743\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6886078470424218,\n\
\ \"acc_stderr\": 0.004621163476949205,\n \"acc_norm\": 0.8755228042222665,\n\
\ \"acc_norm_stderr\": 0.003294504807555228\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047424,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047424\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.03110318238312338,\n\
\ \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.03110318238312338\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.02713429162874171,\n\
\ \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.02713429162874171\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.03514942551267439,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.03514942551267439\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7106382978723405,\n \"acc_stderr\": 0.02964400657700962,\n\
\ \"acc_norm\": 0.7106382978723405,\n \"acc_norm_stderr\": 0.02964400657700962\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.04043461861916746,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.04043461861916746\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47619047619047616,\n \"acc_stderr\": 0.02572209706438853,\n \"\
acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.02572209706438853\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823078,\n \"\
acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823078\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5714285714285714,\n \"acc_stderr\": 0.034819048444388045,\n \"\
acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.034819048444388045\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066584,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066584\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.898989898989899,\n \"acc_stderr\": 0.021469735576055343,\n \"\
acc_norm\": 0.898989898989899,\n \"acc_norm_stderr\": 0.021469735576055343\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.0180883938390789,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.0180883938390789\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687968,\n\
\ \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687968\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7815126050420168,\n \"acc_stderr\": 0.02684151432295893,\n \
\ \"acc_norm\": 0.7815126050420168,\n \"acc_norm_stderr\": 0.02684151432295893\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9009174311926605,\n \"acc_stderr\": 0.01280978008187893,\n \"\
acc_norm\": 0.9009174311926605,\n \"acc_norm_stderr\": 0.01280978008187893\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.033622774366080424,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.033622774366080424\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9019607843137255,\n \"acc_stderr\": 0.0208711184555521,\n \"acc_norm\"\
: 0.9019607843137255,\n \"acc_norm_stderr\": 0.0208711184555521\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8818565400843882,\n \"acc_stderr\": 0.02101105265987847,\n \"\
acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.02101105265987847\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n\
\ \"acc_stderr\": 0.027584066602208274,\n \"acc_norm\": 0.7847533632286996,\n\
\ \"acc_norm_stderr\": 0.027584066602208274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.031545216720054725,\n\
\ \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.031545216720054725\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.035207039905179635,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.035207039905179635\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8466257668711656,\n \"acc_stderr\": 0.0283116014414386,\n\
\ \"acc_norm\": 0.8466257668711656,\n \"acc_norm_stderr\": 0.0283116014414386\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n\
\ \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.5714285714285714,\n\
\ \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.01831589168562585,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.01831589168562585\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8697318007662835,\n\
\ \"acc_stderr\": 0.012036729568216054,\n \"acc_norm\": 0.8697318007662835,\n\
\ \"acc_norm_stderr\": 0.012036729568216054\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7687861271676301,\n \"acc_stderr\": 0.022698657167855713,\n\
\ \"acc_norm\": 0.7687861271676301,\n \"acc_norm_stderr\": 0.022698657167855713\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.646927374301676,\n\
\ \"acc_stderr\": 0.01598420454526858,\n \"acc_norm\": 0.646927374301676,\n\
\ \"acc_norm_stderr\": 0.01598420454526858\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.024739981355113592,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.024739981355113592\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.02103851777015737,\n\
\ \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.02103851777015737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.599290780141844,\n \"acc_stderr\": 0.029233465745573096,\n \
\ \"acc_norm\": 0.599290780141844,\n \"acc_norm_stderr\": 0.029233465745573096\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5814863102998696,\n\
\ \"acc_stderr\": 0.012599505608336482,\n \"acc_norm\": 0.5814863102998696,\n\
\ \"acc_norm_stderr\": 0.012599505608336482\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7316176470588235,\n \"acc_stderr\": 0.026917481224377204,\n\
\ \"acc_norm\": 0.7316176470588235,\n \"acc_norm_stderr\": 0.026917481224377204\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7679738562091504,\n \"acc_stderr\": 0.017077373377856933,\n \
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.017077373377856933\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8081632653061225,\n \"acc_stderr\": 0.025206963154225395,\n\
\ \"acc_norm\": 0.8081632653061225,\n \"acc_norm_stderr\": 0.025206963154225395\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166323,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.02567934272327692,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.02567934272327692\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.6451679386365279,\n\
\ \"mc2_stderr\": 0.014753028795637621\n }\n}\n```"
repo_url: https://huggingface.co/adonlee/LLaMA_2_70B_LoRA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|arc:challenge|25_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hellaswag|10_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T21-35-51.410251.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T21-35-51.410251.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T21-35-51.410251.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T21-35-51.410251.parquet'
- config_name: results
data_files:
- split: 2023_09_22T21_35_51.410251
path:
- results_2023-09-22T21-35-51.410251.parquet
- split: latest
path:
- results_2023-09-22T21-35-51.410251.parquet
---
# Dataset Card for Evaluation run of adonlee/LLaMA_2_70B_LoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/adonlee/LLaMA_2_70B_LoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [adonlee/LLaMA_2_70B_LoRA](https://huggingface.co/adonlee/LLaMA_2_70B_LoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adonlee__LLaMA_2_70B_LoRA",
"harness_truthfulqa_mc_0",
	split="latest")
```
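Each per-task config name follows a mechanical transformation of the task identifier used in the results JSON: pipes, colons, and hyphens become underscores. A small helper — an illustration of the naming convention, not part of any official API — can derive the config name for any task:

```python
def leaderboard_config_name(task: str) -> str:
    """Map a task id like 'harness|truthfulqa:mc|0' to its dataset
    config name, e.g. 'harness_truthfulqa_mc_0'."""
    # Pipes, colons, and hyphens in the task id all become underscores.
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

# Example: derive the config name for the abstract algebra MMLU task.
config = leaderboard_config_name("harness|hendrycksTest-abstract_algebra|5")
# config == "harness_hendrycksTest_abstract_algebra_5"
```

The resulting name can then be passed as the second argument to `load_dataset` as in the snippet above.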
## Latest results
These are the [latest results from run 2023-09-22T21:35:51.410251](https://huggingface.co/datasets/open-llm-leaderboard/details_adonlee__LLaMA_2_70B_LoRA/blob/main/results_2023-09-22T21-35-51.410251.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own config, under the "latest" split):
```python
{
"all": {
"acc": 0.7077096775676626,
"acc_stderr": 0.030867670314758275,
"acc_norm": 0.7114995822621553,
"acc_norm_stderr": 0.030836833292351554,
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168106,
"mc2": 0.6451679386365279,
"mc2_stderr": 0.014753028795637621
},
"harness|arc:challenge|25": {
"acc": 0.6902730375426621,
"acc_stderr": 0.013512058415238361,
"acc_norm": 0.726962457337884,
"acc_norm_stderr": 0.013019332762635743
},
"harness|hellaswag|10": {
"acc": 0.6886078470424218,
"acc_stderr": 0.004621163476949205,
"acc_norm": 0.8755228042222665,
"acc_norm_stderr": 0.003294504807555228
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047424,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047424
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.03110318238312338,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.03110318238312338
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.02713429162874171,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.02713429162874171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267439,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267439
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7106382978723405,
"acc_stderr": 0.02964400657700962,
"acc_norm": 0.7106382978723405,
"acc_norm_stderr": 0.02964400657700962
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.04043461861916746,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.04043461861916746
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.02572209706438853,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.02572209706438853
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823078,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823078
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.034819048444388045,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.034819048444388045
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066584,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066584
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.898989898989899,
"acc_stderr": 0.021469735576055343,
"acc_norm": 0.898989898989899,
"acc_norm_stderr": 0.021469735576055343
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.0180883938390789,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.0180883938390789
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7102564102564103,
"acc_stderr": 0.023000628243687968,
"acc_norm": 0.7102564102564103,
"acc_norm_stderr": 0.023000628243687968
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253252,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253252
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7815126050420168,
"acc_stderr": 0.02684151432295893,
"acc_norm": 0.7815126050420168,
"acc_norm_stderr": 0.02684151432295893
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9009174311926605,
"acc_stderr": 0.01280978008187893,
"acc_norm": 0.9009174311926605,
"acc_norm_stderr": 0.01280978008187893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.033622774366080424,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.033622774366080424
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.0208711184555521,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.0208711184555521
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.02101105265987847,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.02101105265987847
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7847533632286996,
"acc_stderr": 0.027584066602208274,
"acc_norm": 0.7847533632286996,
"acc_norm_stderr": 0.027584066602208274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.031545216720054725,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.031545216720054725
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.035207039905179635,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.035207039905179635
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8466257668711656,
"acc_stderr": 0.0283116014414386,
"acc_norm": 0.8466257668711656,
"acc_norm_stderr": 0.0283116014414386
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.01831589168562585,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.01831589168562585
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8697318007662835,
"acc_stderr": 0.012036729568216054,
"acc_norm": 0.8697318007662835,
"acc_norm_stderr": 0.012036729568216054
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7687861271676301,
"acc_stderr": 0.022698657167855713,
"acc_norm": 0.7687861271676301,
"acc_norm_stderr": 0.022698657167855713
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.646927374301676,
"acc_stderr": 0.01598420454526858,
"acc_norm": 0.646927374301676,
"acc_norm_stderr": 0.01598420454526858
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.024739981355113592,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.024739981355113592
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.02103851777015737,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.02103851777015737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.599290780141844,
"acc_stderr": 0.029233465745573096,
"acc_norm": 0.599290780141844,
"acc_norm_stderr": 0.029233465745573096
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5814863102998696,
"acc_stderr": 0.012599505608336482,
"acc_norm": 0.5814863102998696,
"acc_norm_stderr": 0.012599505608336482
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7316176470588235,
"acc_stderr": 0.026917481224377204,
"acc_norm": 0.7316176470588235,
"acc_norm_stderr": 0.026917481224377204
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.017077373377856933,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.017077373377856933
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8081632653061225,
"acc_stderr": 0.025206963154225395,
"acc_norm": 0.8081632653061225,
"acc_norm_stderr": 0.025206963154225395
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166323,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166323
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.02567934272327692,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.02567934272327692
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168106,
"mc2": 0.6451679386365279,
"mc2_stderr": 0.014753028795637621
}
}
```
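A results file like the one above can be post-processed directly; for example, a macro-average over the MMLU (`hendrycksTest`) tasks can be sketched as below. The two-task dict is an illustrative stand-in for the full results object, not the actual file contents:

```python
# Sketch: compute a macro-average accuracy over harness-style per-task results.
# The mini results dict below is illustrative; real files contain all 57 MMLU tasks.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5301204819277109},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8713450292397661},
}

mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
macro_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"MMLU macro-average acc over {len(mmlu_tasks)} tasks: {macro_acc:.4f}")
```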
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
rajistics/auditor_review | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license:
- cc-by-nc-sa-3.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
- sentiment-classification
paperswithcode_id: null
pretty_name: Auditor_Review
---
# Dataset Card for Auditor_Review
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
Auditor review data collected by the News Department.
- **Point of Contact:** COE for Auditing
### Dataset Summary
An auditor sentiment dataset of sentences from financial news. The dataset consists of *** sentences from English-language financial news, categorized by sentiment. The dataset is divided by the agreement rate of 5-8 annotators.
### Supported Tasks and Leaderboards
Sentiment Classification
### Languages
English
## Dataset Structure
### Data Instances
```
{ "sentence": "Pharmaceuticals group Orion Corp reported a fall in its third-quarter earnings that were hit by larger expenditures on R&D and marketing .",
"label": "negative"
}
```
### Data Fields
- sentence: a tokenized line from the dataset
- label: a label corresponding to the class as a string: 'positive', 'negative' or 'neutral'
### Data Splits
A test train split was created randomly with a 75/25 split
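A split like this can be reproduced with a fixed seed; below is a minimal sketch using plain Python (the example rows are placeholders, not actual dataset rows):

```python
import random

# Sketch: reproduce a random 75/25 train/test split with a fixed seed.
# The example rows below are placeholders, not actual dataset rows.
examples = [{"sentence": f"sentence {i}", "label": "neutral"} for i in range(100)]

rng = random.Random(42)  # fixed seed so the split is reproducible
shuffled = examples[:]
rng.shuffle(shuffled)

cut = int(0.75 * len(shuffled))
train, test = shuffled[:cut], shuffled[cut:]
print(len(train), len(test))  # 75 25
```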
## Dataset Creation
### Curation Rationale
The key arguments for the low utilization of statistical techniques in
financial sentiment analysis have been the difficulty of implementation for
practical applications and the lack of high quality training data for building
such models. ***
### Source Data
#### Initial Data Collection and Normalization
The corpus used in this paper is made out of English news on all listed
companies in ****
#### Who are the source language producers?
The source data was written by various auditors
### Annotations
#### Annotation process
This release of the financial phrase bank covers a collection of 4840
sentences. The selected collection of phrases was annotated by 16 people with
adequate background knowledge on financial markets.
Given the large number of overlapping annotations (5 to 8 annotations per
sentence), there are several ways to define a majority vote based gold
standard. To provide an objective comparison, we have formed 4 alternative
reference datasets based on the strength of majority agreement:
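The majority-vote idea can be sketched as follows: for each sentence, count the annotator labels, keep the most frequent one, and record the agreement rate so sentences can later be filtered by agreement strength. The annotation lists below are invented for illustration:

```python
from collections import Counter

# Sketch: derive a majority-vote gold label and its agreement rate from
# overlapping annotations (5-8 labels per sentence in the real data).
# The annotation lists below are invented for illustration.
annotations = {
    "s1": ["positive", "positive", "positive", "neutral", "positive"],
    "s2": ["neutral", "negative", "neutral", "neutral", "negative", "neutral"],
}

gold = {}
for sent_id, labels in annotations.items():
    label, count = Counter(labels).most_common(1)[0]
    gold[sent_id] = {"label": label, "agreement": count / len(labels)}

# Keep only sentences where at least 75% of annotators agree.
strong = {k: v for k, v in gold.items() if v["agreement"] >= 0.75}
print(gold, strong)
```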
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
All annotators were from the same institution and so interannotator agreement
should be understood with this taken into account.
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
License: Creative Commons Attribution 4.0 International License (CC-BY)
### Contributions
|
manu/bnf_clean | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: author
dtype: string
- name: title
dtype: string
- name: mean_nqa
dtype: float64
- name: date
dtype: string
- name: subject
dtype: string
- name: rights
dtype: string
- name: original_folder
dtype: string
- name: perplexity
dtype: float64
splits:
- name: '2023'
num_bytes: 129088433.72207084
num_examples: 441
- name: '2021_1'
num_bytes: 96451.66666666667
num_examples: 5
- name: '2021_2'
num_bytes: 85416.8
num_examples: 4
download_size: 77863123
dataset_size: 129270302.18873751
---
# Dataset Card for "bnf_clean"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adi-kmt/Parlar | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
size_categories:
- 10K<n<100K
---
Parlar is a Catalan word that means 'to talk'.
This dataset contains conversations covering multiple topics of daily life, such as basic electrical and medical matters, science, and some philosophy.
We have also tried to generate different styles, attitudes, and roles.
GPT-4 Credits graciously donated by [Harsh Gupta](https://twitter.com/hargup13)
## Caution
This dataset was machine-generated; please note that some content may not be entirely precise or reflect expert consensus. Users are encouraged to verify information independently for scholarly or critical purposes. |
sergiolucero/5medwords_chile | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 20757.0
num_examples: 5
download_size: 29250
dataset_size: 20757.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
huangyt/FINETUNE3_TEST2 | ---
license: openrail
---
|
m-a-p/Code-Feedback | ---
language:
- en
pipeline_tag: text-generation
tags:
- code
license: apache-2.0
task_categories:
- question-answering
size_categories:
- 10K<n<100K
---
<h1 align="center"> OpenCodeInterpreter: Integrating Code Generation with Execution and Refinement<h1>
<p align="center">
<img width="1000px" alt="OpenCodeInterpreter" src="https://opencodeinterpreter.github.io/static/images/figure1.png">
</p>
<p align="center">
<a href="https://opencodeinterpreter.github.io/">[🏠Homepage]</a>
|
<a href="https://github.com/OpenCodeInterpreter/OpenCodeInterpreter/">[🛠️Code]</a>
</p>
<hr>
## Introduction
OpenCodeInterpreter is a family of open-source code generation systems designed to bridge the gap between large language models and advanced proprietary systems like the GPT-4 Code Interpreter. It significantly advances code generation capabilities by integrating execution and iterative refinement functionalities.
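The execute-and-refine loop at the heart of such systems can be sketched in a few lines. Here `generate` is a hard-coded stand-in for a real model call, purely for illustration:

```python
# Sketch of an execute-and-refine loop: run generated code, feed any error
# back to the generator, and retry. `generate` is a hard-coded stand-in for
# a real LLM call, purely for illustration.
def generate(task, feedback=None):
    if feedback is None:
        return "result = 1 / 0"          # first attempt: buggy code
    return "result = sum(range(10))"     # refined attempt after seeing the error

def run(code):
    scope = {}
    try:
        exec(code, scope)
        return True, str(scope.get("result"))
    except Exception as e:
        return False, f"{type(e).__name__}: {e}"

feedback = None
for attempt in range(3):
    code = generate("sum the first 10 integers", feedback)
    ok, output = run(code)
    if ok:
        break
    feedback = output  # the error message becomes refinement feedback

print(attempt, ok, output)  # 1 True 45
```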
For further information and related work, refer to our paper: ["OpenCodeInterpreter: Integrating Code Generation with Execution and Refinement"](https://arxiv.org/abs/2402.14658) available on arXiv.
## Contact
If you have any inquiries, please feel free to raise an issue or reach out to us via email at: xiangyue.work@gmail.com, zhengtianyu0428@gmail.com.
We're here to assist you!
⚠️The dataset contains part data generated by GPT-4-0613 and GPT-3.5-turbo-0613, developed by OpenAI. Please pay attention to OpenAI's usage policy when adopting this dataset: https://openai.com/policies/usage-policies. |
aniketl07/test | ---
license: apache-2.0
---
|
cpalang/Methods2Test_CompleteContext | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: FocalMethod
dtype: string
- name: TestCase
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 6361648462
num_examples: 624022
- name: test
num_bytes: 1663343888
num_examples: 156922
download_size: 508889186
dataset_size: 8024992350
---
|
arthurmluz/GPTextSum_data-wiki_gptextsum2_results | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 93872
num_examples: 20
download_size: 90986
dataset_size: 93872
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "GPTextSum_data-wiki_gptextsum2_results"
Evaluation results on the validation split:

- ROUGE: rouge1 = 0.4600676970614709, rouge2 = 0.2024089594170197, rougeL = 0.28630530856939856, rougeLsum = 0.28630530856939856
- BERTScore: precision = 0.7757186979055405, recall = 0.7327599436044693, f1 = 0.7533363491296768 |
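The ROUGE-1 number above is essentially a unigram precision/recall/F1 between the reference and generated summaries; below is a minimal sketch of the idea (not the stemmed, tokenizer-aware implementation real scorers such as the `rouge_score` package use):

```python
from collections import Counter

# Sketch: unigram ROUGE-1 precision/recall/F1. Real implementations
# also apply stemming and more careful tokenization.
def rouge1(reference, candidate):
    ref = Counter(reference.lower().split())
    cand = Counter(candidate.lower().split())
    overlap = sum((ref & cand).values())  # clipped unigram matches
    p = overlap / max(sum(cand.values()), 1)
    r = overlap / max(sum(ref.values()), 1)
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return {"precision": p, "recall": r, "f1": f1}

print(rouge1("the cat sat on the mat", "the cat lay on the mat"))
```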